How We Measure Success: Transparent KPIs at Aditya Labs
B Mohan
Published February 20, 2026 · Updated February 20, 2026 · 6 min read
Most SaaS companies publish flashy numbers on their websites: "10,000 users!" "500% growth!" "Millions of messages processed!" But what do those numbers actually mean? Are those paying users or free signups? Is that growth in revenue or just in email addresses collected?
At Aditya Labs, we believe that if you are going to publish metrics, they should be the ones that actually matter to your experience as a customer. This post explains the five KPIs we track internally, why we chose them, and where we stand today.
## The 5 KPIs We Track
### 1. Platform Uptime
**What it measures:** The percentage of time our platform is operational and accessible to your AI agents.
**Our target:** 99.9% uptime, which allows for approximately 8.7 hours of downtime per year.
**Why it matters:** If our platform is down, your AI agent is offline and your customers see no response. Uptime is not a vanity metric — it directly impacts your business.
**Industry context:** According to Gartner, the average cost of IT downtime across industries is approximately $5,600 per minute. For small businesses relying on our platform for customer engagement, even brief outages can mean missed leads and frustrated customers.
**Where we stand:** As a newly launched platform, we are publishing our uptime data monthly on our status page. We do not yet have a full year of uptime history to report, and we will not pretend otherwise.
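The 8.7-hour allowance quoted above follows directly from the target percentage. As a quick arithmetic sketch (illustrative only):

```python
# Allowed downtime per year implied by an uptime target.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def allowed_downtime_hours(uptime_pct: float) -> float:
    """Hours of downtime per year permitted by an uptime percentage."""
    return HOURS_PER_YEAR * (1 - uptime_pct / 100)

print(round(allowed_downtime_hours(99.9), 2))   # 8.76 hours/year ("three nines")
print(round(allowed_downtime_hours(99.99), 2))  # 0.88 hours/year ("four nines")
```

Each additional "nine" cuts the allowance by a factor of ten, which is why uptime targets beyond 99.9% get expensive quickly.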
### 2. Response Accuracy (RAG Grounding)
**What it measures:** The percentage of AI agent responses that are grounded in the customer's actual knowledge base content, as opposed to hallucinated or fabricated answers.
**Our target:** Greater than 95% grounded responses during conversations where relevant knowledge base content exists.
**Why it matters:** An AI agent that makes things up is worse than no AI agent at all. If a dental practice's AI tells a patient the wrong office hours or invents a service that does not exist, that destroys trust.
**How we measure it:** Our Retrieval-Augmented Generation (RAG) system retrieves relevant documents from the knowledge base before generating a response. We track what percentage of responses are directly supported by retrieved content. When no relevant content exists, the agent is instructed to say it does not know and offer to connect the visitor with a human.
**Where we stand:** We are currently measuring grounding rates across our internal test suites. We will publish aggregate accuracy data once we have a statistically meaningful sample from production usage.
### 3. Customer Satisfaction (Net Promoter Score)
**What it measures:** How likely our customers are to recommend Aditya Labs to a colleague or peer.
**Our target:** NPS of 50 or above, which Bain & Company (the creators of NPS) considers "excellent" in the SaaS industry.
**Why it matters:** NPS captures overall customer sentiment in a single number. According to Bain & Company's original research, companies with the highest NPS in their industry tend to grow at more than twice the rate of competitors.
**Industry context:** According to data published by Retently, the average NPS for the SaaS industry falls between 30 and 40. We are setting our target above that average because we believe our transparency-first approach should translate to higher trust and satisfaction.
**Where we stand:** We are a new company. We do not yet have enough customers to report a statistically meaningful NPS. We will publish our first NPS data once we reach a sufficient sample size. We refuse to publish an NPS based on a handful of responses and call it representative.
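For readers unfamiliar with the calculation: NPS is the percentage of promoters (scores of 9-10 on the "how likely are you to recommend us" question) minus the percentage of detractors (scores of 0-6). A minimal sketch:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count in the denominator but neither add nor subtract."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 6, 4]))  # 30.0
```

The example also shows why small samples are unreliable: with 10 responses, a single customer moving between categories swings the score by 10 or 20 points.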
### 4. Support Response Time
**What it measures:** The time between a customer submitting a support request and receiving a substantive response from our team (not an auto-reply).
**Our target:** Less than 4 hours during business hours, less than 12 hours outside business hours.
**Why it matters:** According to SuperOffice's customer service benchmark report, the average response time across industries is approximately 12 hours, with many companies taking over 24 hours. We want to significantly outperform this average because we are a customer-facing AI company — if we cannot respond quickly to our own customers, how can we credibly promise that for theirs?
**Where we stand:** We currently have a small team, which means response times can vary. We track every support interaction and will publish average response times quarterly once we have sufficient data.
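As a sketch of how first-response time can be measured while excluding auto-replies (the ticket structure below is illustrative, not our internal schema):

```python
from datetime import datetime, timedelta

def avg_first_response_hours(tickets: list[dict]) -> float:
    """Mean hours from submission to the first substantive (non-auto-reply)
    response. Each ticket: {'submitted': datetime,
    'responses': [(datetime, is_auto_reply), ...]}."""
    deltas = []
    for t in tickets:
        substantive = [ts for ts, auto in t["responses"] if not auto]
        if substantive:
            deltas.append(min(substantive) - t["submitted"])
    if not deltas:
        raise ValueError("no substantive responses to measure")
    return sum(deltas, timedelta()).total_seconds() / 3600 / len(deltas)

tickets = [
    {"submitted": datetime(2026, 2, 20, 9, 0),
     "responses": [(datetime(2026, 2, 20, 9, 1), True),   # auto-reply, ignored
                   (datetime(2026, 2, 20, 11, 30), False)]},
    {"submitted": datetime(2026, 2, 20, 14, 0),
     "responses": [(datetime(2026, 2, 20, 17, 30), False)]},
]
print(avg_first_response_hours(tickets))  # (2.5 + 3.5) / 2 = 3.0 hours
```

Discarding the auto-reply timestamp is the important detail: counting it would make almost any team look instant while measuring nothing a customer cares about.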
### 5. Honest Marketing Compliance
**What it measures:** An internal audit score reflecting whether our marketing materials, blog posts, landing pages, and sales communications meet our honesty standards.
**Our target:** 100% compliance. Every claim sourced, every limitation disclosed, no fabricated testimonials.
**Why it matters:** The Edelman Trust Barometer 2024 found that 63% of consumers will buy from brands they trust, even over brands they like more. Trust is not built through clever marketing — it is built through consistent honesty. We audit our own materials regularly against a checklist that includes: Are all statistics sourced? Are limitations clearly stated? Are customer results verified? Is pricing transparent?
**Where we stand:** We conduct this audit internally before publishing any new content. We have not yet engaged a third-party auditor, but we plan to do so as we scale.
## Why Transparent Metrics Matter
The SaaS industry has a measurement problem. According to data published by ProfitWell, the median SaaS net revenue retention rate falls in the range of 100% to 110%. Yet many companies report only gross metrics that obscure churn and contraction revenue.
When companies hide behind vanity metrics, customers lose the ability to make informed decisions. You deserve to know whether the company you are paying is actually delivering results — not just growing its email list.
Publishing real KPIs holds us accountable. If our uptime drops below target, you will know. If our NPS falls, we will report it. This transparency creates a feedback loop that forces us to improve.
## What We Will Never Do
We are drawing a clear line against metric practices we consider dishonest:
- Publishing vanity metrics, such as raw signups or email addresses collected, in place of metrics that reflect customer outcomes
- Citing statistics without a source
- Fabricating or embellishing testimonials or customer results
- Reporting survey numbers, such as NPS, from samples too small to be representative
## Our Commitment Going Forward
We will publish a quarterly transparency report that includes our performance against each of these five KPIs. We will celebrate improvements and acknowledge setbacks honestly.
If we ever fall short of our targets, we will explain what happened, what we learned, and what we are doing about it. We believe this level of accountability is rare in our industry — and that it should not be.
B Mohan
Founder, Aditya Labs
Building AI-powered customer service tools to help small businesses capture every lead and never miss a customer inquiry. Based in Watford, UK.