Metric Glossary
A definitive handbook of product metrics. Understand the math, the meaning, and the common pitfalls before you optimize.
DAU
Daily Active Users [growth]
Count(Unique Users with > 0 Sessions in 24h)
- Avoid vanity: Define "Active" strictly. Viewing a page might not be enough; performing a core action is better.
- Fluctuations: DAU can be volatile on weekends vs weekdays.
MAU
Monthly Active Users [growth]
Count(Unique Users with > 0 Sessions in 30 days)
- Lagging indicator: A drop in MAU is often noticed too late.
- Good for high-level health checks, but less actionable than DAU or WAU.
WAU
Weekly Active Users [growth]
Count(Unique Users with > 0 Sessions in 7 days)
- Ideal for B2B products where usage is consistently weekly but not necessarily daily.
- Less volatile than DAU, more actionable than MAU.
Active Hosts
Active Hosts (Marketplace) [growth]
Count(Hosts with > 0 Active Listings or Sessions)
- Supply constraints: In marketplaces, supply is often the constraint. Tracking active hosts is as critical as active buyers.
- Quality over Quantity: 10 active super-hosts are better than 100 inactive ones.
New Host Signups / Signups
New User Signups [growth]
Count(Users Registering Account)
- Top of Funnel: High signups with low activation is a sign of poor targeting or onboarding, not success.
- Spam Risk: Be aware of bot signups inflating this number.
Activation Rate
User Activation Rate [growth]
(Users who performed Activation Event / Total New Users) * 100
- Critical metric: Signups without activation is a leaky bucket.
- Time-bound: Always define a window (e.g. Activated within 7 days).
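The time-bound definition above can be sketched in a few lines of Python. The helper name `activation_rate` and the `(signup, activated_at)` pair shape are illustrative assumptions, not a reference to any particular analytics stack:

```python
from datetime import datetime, timedelta

def activation_rate(users, window_days=7):
    """Share of new users who hit the activation event within the window.

    `users` is a list of (signup_time, activation_time_or_None) pairs,
    a simplified stand-in for whatever your event store returns.
    """
    if not users:
        return 0.0
    window = timedelta(days=window_days)
    activated = sum(
        1 for signup, activated_at in users
        if activated_at is not None and activated_at - signup <= window
    )
    return activated / len(users) * 100

# Example cohort: 2 of 4 new users activate inside the 7-day window.
t0 = datetime(2024, 1, 1)
cohort = [
    (t0, t0 + timedelta(days=2)),   # activated on day 2 -> counts
    (t0, t0 + timedelta(days=10)),  # activated too late -> does not count
    (t0, None),                     # never activated
    (t0, t0 + timedelta(hours=3)),  # activated same day -> counts
]
```

Changing `window_days` is how you test different time-bound definitions against the same cohort.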
Subscriber Base
Total Subscribers [growth]
Count(Users with status = "active")
- Paused vs Churned: Decide how to handle paused subscriptions. They are not active, but not fully churned.
- Tier Mix: Track the breakdown of Basic vs Premium subscribers.
Organic Traffic
Organic User Traffic [growth]
Count(Sessions where Source = Organic Search)
- SEO Health: Primary indicator of SEO performance.
- Brand Strength: High direct/organic traffic often signals strong brand recall.
Viral Coefficient (K-factor)
Viral Coefficient [growth]
Avg # of Invitations Sent * Conversion Rate of Invitations
- K > 1: Exponential growth (Viral).
- K < 1: Linear or decaying growth (Paid/Organic dependent).
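A quick sketch of the K-factor formula and its compounding effect. The cycle simulation is a deliberate simplification that assumes every new cohort invites at the same rate as the last:

```python
def k_factor(invites_per_user, invite_conversion_rate):
    # K = average invitations sent per user * conversion rate per invitation
    return invites_per_user * invite_conversion_rate

def users_after_cycles(seed_users, k, cycles):
    # Simplified model: each viral cycle, the newest cohort recruits k users
    # per member, so total = seed * (1 + k + k^2 + ... + k^cycles).
    total, cohort = float(seed_users), float(seed_users)
    for _ in range(cycles):
        cohort *= k
        total += cohort
    return total

# 5 invites per user converting at 20% sits exactly at the viral threshold.
k = k_factor(5, 0.2)
```

With K = 1 each cycle replaces itself (linear growth); below 1 the cohorts shrink and growth must come from paid or organic channels.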
CPI
Cost Per Install [growth]
Total Ad Spend / Total Installs
- Platform Variance: CPI on iOS is typically higher than Android.
- Fraud: Watch out for install farms inflating numbers.
PMF
Product-Market Fit [general]
Qualitative (Sean Ellis Test: >40% would be "very disappointed" without it)
- Sustainable growth: PMF is the precursor to scaling. Don't pour fuel on a product without PMF.
- Retention as proxy: High long-term retention is the best quantitative indicator of PMF.
Aha! Moment
Aha! Moment [growth]
User performs [Action] + [Frequency] within [Time Period]
- Facebook: 7 friends in 10 days.
- Slack: 2,000 messages sent within a team.
- Dropbox: 1 file in 1 folder on 1 device.
Referral Rate
Referral Rate [growth]
(Number of Referrals / Total Customers) * 100
- Advocacy: High referral rate indicates strong product love.
- Incentives: Track organic vs incentivized referrals separately.
PQL
Product Qualified Leads [growth]
Users hitting Activation criteria
- Sales alignment: PQLs convert much higher than MQLs (Marketing Qualified Leads).
- Usage based: Defined by actual product usage, not just demographic fit.
ARR
Annual Recurring Revenue [revenue]
Sum(Annual Subscription Value of all active customers)
- Don't include one-time fees (consulting, setup).
- Committed vs Contracted: Use committed revenue for accurate forecasting.
GMV
Gross Merchandise Value [revenue]
Sales Price * Number of Items Sold
- Revenue is separate: GMV is not your revenue; it is the throughput. Revenue is GMV * Take Rate.
- Returns: Gross GMV usually includes returns; Net GMV excludes them.
GMS
Gross Merchandise Sales [revenue]
Sum(Sales Volume)
- Marketplace Context: Often used interchangeably with GMV but specifically refers to the sales volume processed.
- Net GMS: Sales after cancellations and returns.
Take Rate
Take Rate / Commission Rate [revenue]
(Revenue / GMV) * 100
- Marketplace Health: A high take rate could drive sellers away; a low take rate might make the business unsustainable.
- Components: Often includes commission + payment fees + advertising revenue.
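A minimal sketch of the relationship between GMV, take rate, and platform revenue (the function names are illustrative):

```python
def take_rate(revenue, gmv):
    # Platform revenue as a percentage of merchandise value flowing through.
    if gmv == 0:
        return 0.0
    return revenue / gmv * 100

def revenue_from_gmv(gmv, take_rate_pct):
    # The inverse view: Revenue = GMV * Take Rate.
    return gmv * take_rate_pct / 100
```
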
TPV
Total Payment Volume [revenue]
Sum(Value of all processed transactions)
- Metric for Fintech: Similar to GMV for e-commerce. Indicates scale/throughput.
- Currency fluctuations: Be careful when aggregating TPV across multiple currencies.
AOV
Average Order Value [revenue]
Total Revenue / Total Number of Orders
- Pricing Strategy: Increasing AOV is often easier than getting new customers (bundling, upsells).
- Beware of outliers: One whale buyer can skew the average. Check the median too.
ARPU
Average Revenue Per User [revenue]
Total Revenue / Total Active Users
- Segmentation key: ARPU varies wildly by cohort (e.g. Free vs Pro). Always segment.
- Vanity warning: High ARPU with shrinking user base is a warning sign of a niche product.
LTV
Lifetime Value [revenue]
(ARPU * Gross Margin %) / Churn Rate
- Prediction hazard: LTV is a forecast, not a fact. Be conservative with churn assumptions.
- LTV:CAC Ratio: Ideally 3:1. If 1:1, you are bleeding money. If 5:1, you are under-investing in growth.
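The formula above is the standard geometric-series shortcut: if a constant fraction of customers churns each period, the expected lifetime is 1 / churn, so lifetime gross profit is (ARPU * margin) / churn. A sketch with illustrative names:

```python
def ltv(arpu, gross_margin, churn_rate):
    # (ARPU * Gross Margin %) / Churn Rate, with margin and churn as fractions.
    # Assumes churn is constant over the customer's life (lifetime = 1 / churn).
    return (arpu * gross_margin) / churn_rate

def ltv_cac_ratio(ltv_value, cac):
    # Rule of thumb: ~3:1 is healthy; 1:1 bleeds money; 5:1 may be under-investing.
    return ltv_value / cac
```

For example, $50 monthly ARPU at 80% margin and 2% monthly churn implies an LTV of about $2,000.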
CAC
Customer Acquisition Cost [revenue]
(Sales + Marketing Expenses) / Number of New Customers Acquired
- Blended vs Paid: Blended CAC hides the inefficiency of paid channels. Always calculate Paid CAC separately.
- Time lag: Marketing spend today might not produce users for months (in B2B).
LTV:CAC
LTV to CAC Ratio [revenue]
LTV / CAC
- Benchmark: 3:1 is the gold standard for SaaS.
- Efficiency: If > 5:1, you are likely under-spending on growth.
CLV
Customer Lifetime Value [revenue]
Avg Order Value * Purchase Frequency * Customer Lifespan
- CLV vs LTV: LTV often refers to Gross Margin contributed, while CLV can refer to pure revenue.
- Cohort tracking: Track CLV by signup month to see if customer quality is improving.
CAC Payback Period
CAC Payback Period [revenue]
CAC / (ARPU * Gross Margin %)
- Cash efficiency: For startups, < 12 months is the goal. > 18 months creates cash flow drag.
- Segment: SMB payback should be faster (6-9mo) than Enterprise (12-18mo).
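The same inputs as LTV, rearranged: months of gross profit needed to recover the acquisition cost. A minimal sketch, with margin expressed as a fraction:

```python
def cac_payback_months(cac, monthly_arpu, gross_margin):
    # CAC / (ARPU * Gross Margin %): how many months of gross profit
    # it takes to earn back what you paid to acquire the customer.
    return cac / (monthly_arpu * gross_margin)

# $1,200 CAC, $200/mo ARPU at 75% margin -> 8 months to break even.
payback = cac_payback_months(1200, 200, 0.75)
```
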
Gross Margin
Gross Margin [revenue]
((Revenue - COGS) / Revenue) * 100
- SaaS scalability: High gross margins (80%+) explain why software companies are valued highly.
- COGS: Includes hosting, support, and payment fees, but not R&D.
EBITDA
EBITDA [revenue]
Net Income + Interest + Taxes + Depreciation + Amortization
- Valuation: Common valuation metric for mature companies.
- Rule of 40: Growth Rate + EBITDA Margin should be > 40%.
Win Rate
Sales Opportunity Win Rate [revenue]
(Closed Won Deals / Total Opportunities) * 100
- Pipeline Quality: Low win rate usually means poor lead qualification, not just bad sales skills.
- Stage analysis: Track conversion rate per stage to find the bottleneck.
Pipeline Value
Sales Pipeline Value [revenue]
Sum(Opportunity Value * Probability to Close)
- Forecast accuracy: Weighted pipeline is only as good as the probability estimates.
- Stale deals: Remove deals that have been "open" too long to keep pipeline realistic.
New Business Revenue
Revenue from New Customers [revenue]
Sum(Revenue from Customers with StartDate in Period)
- Hunter vs Farmer: Measures the effectiveness of your sales "hunters".
- Discounting impacts: Watch out for high new biz rev driven by unsustainable discounts.
Expansion Revenue
Expansion Revenue [revenue]
Sum(Upsell + Cross-sell Revenue)
- Cheaper growth: Expanding existing customers is 5-25x cheaper than acquiring new ones.
- Net Retention driver: This is the fuel for NDR > 100%.
Burn Rate
Net Burn Rate [revenue]
Cash Spent - Cash Revenue (per month)
- Runway calculator: Cash Balance / Burn Rate = Months of Runway.
- Efficiency: High burn is acceptable only if growth is efficiently high (Magic Number > 1).
Runway
Cash Runway [revenue]
Current Cash Balance / Monthly Net Burn Rate
- Survival metric: 18-24 months is standard for venture-backed startups.
- Fundraising trigger: Usually need to start raising when runway < 6-9 months.
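Burn and runway together, as a sketch (names illustrative; a cash-flow-positive company has no finite runway, so that case is handled explicitly):

```python
def net_burn(monthly_cash_out, monthly_cash_in):
    # Net burn: what actually leaves the bank each month.
    return monthly_cash_out - monthly_cash_in

def runway_months(cash_balance, monthly_net_burn):
    # Cash Balance / Net Burn; non-positive burn means unbounded runway.
    if monthly_net_burn <= 0:
        return float("inf")
    return cash_balance / monthly_net_burn
```
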
Magic Number
SaaS Magic Number [revenue]
(Current Q Revenue - Previous Q Revenue) * 4 / Previous Q Sales & Marketing Spend
- > 1.0: Efficient growth. Pour more fuel on the fire.
- < 0.7: Inefficient. Fix the funnel before spending more.
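The formula annualizes the quarter-over-quarter revenue gain and divides by the prior quarter's Sales & Marketing spend. A minimal sketch:

```python
def magic_number(current_q_rev, previous_q_rev, previous_q_sm_spend):
    # Annualized new revenue ((Q-over-Q gain) * 4) per dollar of the
    # prior quarter's Sales & Marketing spend.
    return (current_q_rev - previous_q_rev) * 4 / previous_q_sm_spend

# $250k of new quarterly revenue from $800k of S&M spend -> 1.25.
mn = magic_number(1_250_000, 1_000_000, 800_000)
```
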
MRR
Monthly Recurring Revenue [revenue]
Sum(Monthly Subscription Fees)
- Momentum: The most important metric for SaaS growth.
- Commitment: Includes recurring charges, excludes one-time fees.
Net Revenue Churn
Net Revenue Churn [revenue]
((Revenue Lost from Churn - Expansion Revenue) / Starting Revenue) * 100
- Negative Churn: If Expansion > Churn, you have Negative Net Churn (the holy grail).
- Sustainability: High net churn kills growth.
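A sketch of the formula, showing how expansion can push the result negative (the "negative churn" case):

```python
def net_revenue_churn(churned_rev, expansion_rev, starting_rev):
    # ((Revenue Lost from Churn - Expansion Revenue) / Starting Revenue) * 100.
    # A negative result means expansion outran losses: negative net churn.
    return (churned_rev - expansion_rev) / starting_rev * 100
```
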
Quick Ratio
SaaS Quick Ratio [revenue]
(New MRR + Expansion MRR) / (Churned MRR + Contraction MRR)
- > 4: Excellent growth efficiency.
- < 2: You are burning cash just to replace lost customers.
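In code, the quick ratio is dollars of MRR gained per dollar of MRR lost in the same period (a sketch with illustrative names):

```python
def quick_ratio(new_mrr, expansion_mrr, churned_mrr, contraction_mrr):
    # (New MRR + Expansion MRR) / (Churned MRR + Contraction MRR).
    lost = churned_mrr + contraction_mrr
    if lost == 0:
        return float("inf")  # no losses at all this period
    return (new_mrr + expansion_mrr) / lost
```
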
Rule of 40
Rule of 40 [revenue]
Annual Revenue Growth Rate % + EBITDA Margin %
- Trade-off: You can be unprofitable (-10%) if you are growing fast (50%).
- Balance: High growth permits lower margins, and vice versa.
Burn Multiple
Burn Multiple [revenue]
Net Burn / Net New ARR
- < 1.0: Amazing efficiency.
- > 3.0: Warning sign, spending too much for too little growth.
Churn Rate
Customer Churn Rate [retention]
(Lost Customers / Total Customers at Start of Period) * 100
- Silent killer: Low churn is often more important than high growth for sustainability.
- Revenue Churn vs Logo Churn: You can lose customers but grow revenue (negative churn) if upsells are strong.
NDR / NRR
Net Dollar Retention [retention]
((Starting Revenue + Expansion - Churn - Contraction) / Starting Revenue) * 100
- >100% is the holy grail: It means you grow even if you acquire zero new customers.
- Cohort analysis: Analyze NDR by cohort to see if older cohorts expand or shrink over time.
Repeat Purchase Rate
Repeat Customer Rate [retention]
(Customers with > 1 Order / Total Unique Customers) * 100
- E-commerce vital sign: Breaking even on first purchase is rare; profit comes from repeats.
- Window matters: Define the window carefully (e.g. repeat within 90 days).
Stickiness
Stickiness Ratio (DAU/MAU) [retention]
(DAU / MAU) * 100
- World class: > 20% is good; > 50% is Facebook level.
- Utility drift: Only relevant for daily-use products.
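The ratio reads as "what share of this month's users show up on a typical day". A one-liner sketch:

```python
def stickiness(dau, mau):
    # (DAU / MAU) * 100: share of monthly users active on a typical day.
    return dau / mau * 100
```
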
Sessions per User
Average Frequency of Use [engagement]
Total Sessions / Total Active Users
- Habit Strength: High frequency often indicates strong habit formation.
- Natural Frequency: Don't expect daily use for a monthly utility app (e.g. Payroll).
Time Spent
Average Time Spent per User [engagement]
Total Session Duration / Total Active Users
- Quality vs Quantity: Time spent is good for media (Netflix), but bad for utility (Uber).
- Active vs Idle: Ensure you are tracking active foreground time.
Content Consumption
Content Consumption [engagement]
Sum(Content Units Consumed)
- Feed Health: Critical for media/social apps.
- Completion Rate: Watching 5% of a video counts as a view, but completion is a better signal of quality.
Content Engagement
Engagement with Content [engagement]
Sum(Likes + Comments + Shares)
- Algorithm signal: These are high-value signals for recommendation algorithms.
- Share > Like: A share is usually a much stronger endorsement than a like.
Content Creation / Files Created
User Generated Content Volume [engagement]
Count(New Core Entities Created)
- Prosumer Activity: Creation is often the first step to collaboration/sharing.
- Empty states: Watch out for "Created but empty" files.
Messages Sent
Communication Volume [engagement]
Count(Messages Successfully Sent)
- Network Density: More messages usually means a denser, stickier network.
- Bot traffic: Exclude automated system messages.
Project Activity
Project Management Activity [engagement]
Count(Task Updates + Comments + Transitions)
- Work vs Noise: Are users just moving cards around, or actually closing them?
- Collaboration proxy: High activity implies the team is living in the tool.
Meeting Volume
Meeting Activity [engagement]
Count(Meetings with > 1 participant)
- Cost metric: For internal tools, high meeting volume might be bad (inefficiency).
- Value metric: For Zoom/Teams, this is the core value unit.
Bounce Rate
Website Bounce Rate [engagement]
(Single Page Sessions / Total Sessions) * 100
- Content relevance: High bounce rate often means the landing page didn't match the ad promise.
- Exceptions: Single-page apps (SPAs) or simple informational queries might have naturally high bounce rates.
Cart Abandonment
Cart Abandonment Rate [engagement]
((Initiated Carts - Completed Transactions) / Initiated Carts) * 100
- Friction point: Often caused by surprise costs (shipping, tax) at checkout.
- Retargeting: Primary trigger for abandoned cart emails.
NPS
Net Promoter Score [general]
% Promoters (9-10) - % Detractors (0-6)
- Sentiment vs Reality: Users may say they love you (high NPS) but still churn.
- Use strictly as a pulse check, not a root cause diagnostic tool.
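Computing NPS from raw survey answers is mechanical: bucket the 0-10 scores into promoters, passives, and detractors, then subtract the percentages. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score from raw 0-10 answers to the recommend question."""
    if not scores:
        return 0.0
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)  # 7-8 are passives, ignored
    return (promoters - detractors) / len(scores) * 100
```

Note the score can range from -100 (all detractors) to +100 (all promoters).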
Conversion Rate
Funnel Conversion Rate [general]
(Users entering Stage B / Users entering Stage A) * 100
- Micro-conversions: Don't just track the final sale; track each step to find the friction.
- Local maxima: Optimizing one step (e.g. clicks) might hurt downstream retention.
Resolution Time
Average Resolution Time [general]
Average(Resolved Date - Created Date)
- Customer Satisfaction (CSAT): Long resolution times kill CSAT.
- SLA Breaches: Track % of tickets resolved within SLA, not just average.
Cycle Time
Development Cycle Time [general]
Average(Completion Timestamp - Start Timestamp)
- Flow efficiency: High cycle time often indicates blocking dependencies or context switching.
- Predictability: Stable cycle time is better than fast but erratic cycle time.
User Efficiency
User Workflow Efficiency [general]
Average(Time to Complete Core Action)
- Friction hunting: Use this to identify UX hurdles.
- Learning curve: Efficiency should improve as user tenure increases.
Supply Liquidity
Marketplace Supply Liquidity [general]
(Transactions / Active Listings) * 100
- Marketplace Health: High liquidity means buyers always find what they want.
- Cold start: Hardest metric to move in a new marketplace.
SUS
System Usability Scale [general]
Survey Score (0-100)
- Industry standard: Score > 68 is considered above average.
- Quick pulse: "I thought the system was easy to use."
CES
Customer Effort Score [general]
Survey: "How easy was it to handle your issue?"
- Loyalty predictor: High effort correlates strongly with churn.
- Frictionless: The goal is to make interactions effortless.
Task Success Rate
Task Success Rate [general]
(Successful Completions / Total Attempts) * 100
- Usability validation: If users can't finish the task, the design failed.
- Severity: Distinguish between "gave up" vs "completed with errors".
Time on Task
Time on Task [general]
End Time - Start Time
- Efficiency: Generally lower is better for utility tasks.
- Benchmarking: Compare against expert users or previous versions.
NSM
North Star Metric [general]
Context Dependent (Unique to each product)
- Not Revenue: Revenue is a lagging indicator of value. The North Star should reflect value received.
- Focus alignment: It serves to align the entire company towards a singular product goal.
Input Metric
Input Metric [general]
Actionable & Controllable by teams (e.g. Inventory growth, Latency reduction)
- Actionable: Teams should be able to influence these directly.
- Hypothesis-driven: You believe moving these will eventually move the North Star.
Output Metric
Output Metric [general]
Result-oriented (e.g. Revenue, Total Churn)
- Lagging: By the time you see them move, the activities that caused the move are already over.
- Hard to influence directly: You move them by moving input metrics.
Counter Metric
Counter Metric [general]
Dependency check (e.g. If NSM is Orders, Counter might be Order Cancellations)
- Checks & Balances: Prevents teams from "gaming" the system (e.g. increasing signups but hurting quality).
- Health check: If the counter metric tanks, your NSM growth might be hollow.
OKR
Objectives and Key Results [general]
Objective (Vision) + 3-5 Key Results (Measurable Metrics)
- Ambitious: Objectives should be qualitative and inspiring.
- Quantitative: Key Results must be numbers (e.g. Grow MAU from 1M to 1.5M).
KPI
Key Performance Indicator [general]
Business Vital Signs
- Steady state: KPIs are often monitored continuously to ensure business health.
- NSM vs KPI: A North Star is a type of KPI, but usually the most important one.
MVP
Minimum Viable Product [general]
Leanest version for Learning
- Learning tool: The goal is to test hypotheses, not just build a "cheap" version.
- Viable: It must still solve the core problem for the user.
PRD
Product Requirements Document [general]
The "What" and "Why" of a feature/product
- Alignment: Ensures that everyone on the team knows what is being built and for whom.
- Living document: Should be updated as requirements evolve.
User Persona
User Persona [general]
Demographics + Behaviors + Pain Points + Goals
- Empathy: Helps the team design with a specific human in mind.
- Segmentation: Different personas might have different North Star Metrics if they find different value.
Adoption Rate
Feature Adoption Rate [growth]
(Users of Feature / Total Eligible Users) * 100
- Discovery vs Utility: Low adoption could mean they can't find it (Discovery) or they don't need it (Utility).
- Depth: Adoption is binary; also track frequency/depth of use.
TTV
Time to Value [growth]
Average(Activation Timestamp - Signup Timestamp)
- Onboarding velocity: Faster TTV usually leads to higher retention.
- Complex products: For B2B, TTV might be measured in weeks; for B2C, in minutes.
Guardrail Metric
Guardrail Metric [general]
Latency, Error Rate, Cancellation Rate
- Safety first: If a guardrail metric violates a threshold, stop the experiment immediately.
- Counter Metric vs Guardrail: Counter metrics are often business-oriented (Profit vs Revenue); Guardrails are often technical (Latency, Crashes).
Deployment Frequency
Deployment Frequency [general]
Count(Deployments) / Time Period
- Velocity: High frequency correlates with high-performing teams.
- Batch size: Encourages smaller, safer releases.
Lead Time for Changes
Lead Time for Changes [general]
Deployment Timestamp - Commit Timestamp
- Agility: Lower lead time means faster feedback loops.
- Bottlenecks: Highlights delays in QA or CI/CD pipelines.
MTTR
Mean Time to Recovery [general]
Sum(Downtime) / Count(Incidents)
- Resilience: It is not about never failing, but recovering fast.
- SLA: Critical for enterprise contracts.
Change Failure Rate
Change Failure Rate [general]
(Failed Deployments / Total Deployments) * 100
- Stability: High velocity is useless if quality is low.
- Balance: Aim for speed without breaking things.
Match Rate
Match Rate [general]
(Successful Matches / Total Searches or Requests) * 100
- Liquidity quality: A high match rate ensures users don't leave empty-handed.
- Zero results: Track "Zero Result Searches" closely.
Search to Fill
Search to Fill Rate [general]
(Transactions / Total Searches) * 100
- Relevance: Indicates how well the inventory matches user intent.
- Pricing: Also affected by price matching.
Buyer/Seller Overlap
Buyer/Seller Overlap [general]
(Users who Buy AND Sell / Total Users) * 100
- Network effects: High overlap (e.g. Poshmark, Airbnb) creates a powerful, self-sustaining ecosystem.
- Acquisition: Acquiring one user gets you both supply and demand.
Crash-Free Users
Crash-Free Users Rate [general]
((Total Users - Users with Crash) / Total Users) * 100
- Stability: Target > 99.9% for high-quality apps.
- Retention killer: Crashes are the fastest way to lose a mobile user.
ANR Rate
Application Not Responding Rate [general]
(ANR Sessions / Total Sessions) * 100
- Frustration: Worse than a crash because the user waits.
- Performance: Often tied to main thread blocking.
App Store Conversion
App Store Conversion Rate [general]
(Installs / Page Views) * 100
- ASO: Optimize screenshots, reviews, and description.
- First impression: This is the landing page for mobile.
DAP / MAP
Daily/Monthly Active People [growth]
Count(Unique Physical Humans Active)
- De-duplication: Requires advanced identity matching to link devices/accounts.
- Family Metrics: Used to aggregate usage across a family of apps (FB, Insta, WhatsApp).
Resurrection Rate
User Resurrection Rate [retention]
(Resurrected Users / Total Dormant Users) * 100
- Win-back: Measures the effectiveness of re-engagement campaigns (email, push).
- Cheaper than new: Resurrecting a user is often cheaper than acquiring a completely new one.
Fulfillment Rate
Fulfillment Rate [general]
(Completed Orders / Total Orders Placed) * 100
- Marketplace Trust: Low fulfillment kills trust on the demand side.
- Supply issue: Often indicates lack of supply liquidity or operational failures.
Streak
User Streak Length [engagement]
Count(Consecutive Active Periods)
- Gamification: The most powerful retention mechanic in EdTech (Duolingo) and Health.
- Loss Aversion: Users return just to "save the streak".
KYC Success Rate
KYC Pass Rate [general]
(Verified Users / Total KYC Attempts) * 100
- Funnel Blocker: KYC is the biggest drop-off point in Fintech onboarding.
- Fraud balance: Too easy = fraud; Too hard = lost users.
Items per Basket
Items Per Basket (UPT) [revenue]
Total Items Sold / Total Transactions
- Bundling proxy: High UPT usually means successful cross-selling.
- Inventory efficiency: Helps move more SKUs per logistics cost unit.
Unit Sales
Unit Sales [revenue]
Count(Items Sold)
- Hardware North Star: For Apple/Tesla, units sold is the primary measure of scale.
- Install Base: Units sold accumulates into Active Installed Base.
Lessons Completed
Lessons Completed [engagement]
Count(Completed Lessons)
- Value metric: For EdTech, this is the core value exchange.
- Paywall trigger: Often used as a limit before requiring payment.
Online Hours
Provider Online Hours [growth]
Sum(Time Available Online)
- Supply Capacity: The raw capacity of a service marketplace.
- Peak vs Off-peak: Utilization rate is Online Hours / Busy Hours.
Health Logs
Activities/Meals Logged [engagement]
Count(User Logs)
- Active Input: Requires high effort; an indicator of high intent.
- Retention: Logging is usually the primary habit loop in health apps.
Velocity
Sprint Velocity [general]
Sum(Story Points of Completed User Stories in a Sprint)
- Not a comparison tool: Velocity is unique to each team. Do not compare Team A's velocity to Team B's.
- Predictability: The goal is a stable velocity, not a constantly increasing one.
Lead Time
Product Lead Time [general]
Average(Delivery Timestamp - Creation Timestamp)
- Customer Centric: This is the time the customer actually experiences waiting.
- Cycle vs Lead: Cycle time is when dev starts; Lead time is from when the ticket is created.
WSJF
Weighted Shortest Job First [general]
Cost of Delay / Job Size
- Cost of Delay = User-Business Value + Time Criticality + Risk Reduction/Opportunity Enablement.
- Job Size = Effort/Duration. Doing high-value, easy things first wins.
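The scoring above can be sketched in a few lines. The inputs are relative points (as in SAFe-style planning poker), not absolute dollars, and the function name is illustrative:

```python
def wsjf(user_business_value, time_criticality, risk_opportunity, job_size):
    # Cost of Delay is the sum of the three relative scores; dividing by
    # job size floats small, urgent, valuable jobs to the top of the backlog.
    cost_of_delay = user_business_value + time_criticality + risk_opportunity
    return cost_of_delay / job_size

# Same Cost of Delay, different sizes: the smaller job wins.
small_job = wsjf(8, 5, 3, job_size=2)
large_job = wsjf(8, 5, 3, job_size=8)
```
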
Adoption Rate
Feature Adoption Rate [growth]
(Users who used Feature X / Total Active Users) * 100
- Time-bound: Measure adoption at 1 day, 7 days, and 30 days post-launch.
- Depth vs Breadth: Adoption rate measures breadth (how many tried it), not depth (how often they use it).
TTV
Time To Value [engagement]
Average(Time of Aha Moment - Time of Signup)
- Onboarding: Shortening TTV is the primary goal of user onboarding.
- B2B vs B2C: TTV in B2C is often measured in minutes; in Enterprise B2B, it can be weeks/months.