
Knowledge base analytics - the 3 reports that actually matter

Most knowledge base dashboards show vanity metrics that look impressive but tell you nothing. Here are the 3 reports that actually drive support reduction.


Most knowledge base dashboards show vanity metrics. Total page views, average session duration, top articles - all interesting but disconnected from what matters: are customers finding answers, and are tickets going down? According to McKinsey research (2024), knowledge workers spend 19% of their time searching for information - so the question is whether your knowledge base reduces that or just adds another thing to search through. Three reports tell you the truth: zero-results searches, article feedback, and deflection rate. Here is how to read each, and what to do with the data.

Why most knowledge base reports are useless

A typical knowledge base dashboard shows: page views per article, total searches, average session duration, top categories. None of these answer the question that matters: "Did customers leave with their question answered, or did they escalate to a ticket?"

The dashboards focus on inputs (visits, searches) instead of outcomes (resolution, deflection). Three reports flip that around.

Report 1: Zero-results searches

The most undervalued report in any knowledge base tool. Zero-results searches are queries customers typed into your search box that returned no results. Each one is a content gap waiting to be filled.

What to look for:

  • Frequency. How often is each query searched?
  • Recency. Are these from this week or six months ago?
  • Specificity. "How do I cancel" suggests a missing article on cancellation. "qwerty" suggests typos.

What to do:

  • Top 5 zero-results queries this week → write articles for the next 5 days.
  • Review weekly, not monthly. Zero-results is high-value, time-sensitive feedback.
  • If a query racks up 50+ zero-results searches in a month, you waited too long - the article should have existed by week two.
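The weekly review above can be scripted. A minimal sketch, assuming your tool can export zero-results searches as (query, timestamp) pairs - the export format and field names vary by knowledge base product:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical export of zero-results searches as (query, timestamp) pairs.
zero_results_log = [
    ("how do i cancel", datetime(2026, 5, 4)),
    ("how do i cancel", datetime(2026, 5, 6)),
    ("export invoices", datetime(2026, 5, 5)),
    ("qwerty", datetime(2026, 1, 2)),
]

def top_recent_gaps(log, now, days=7, top_n=5):
    """Rank zero-results queries from the last `days` days by frequency."""
    cutoff = now - timedelta(days=days)
    return Counter(q for q, ts in log if ts >= cutoff).most_common(top_n)

print(top_recent_gaps(zero_results_log, now=datetime(2026, 5, 7)))
# [('how do i cancel', 2), ('export invoices', 1)]
```

Note how the stale "qwerty" query from months ago drops out automatically - recency filtering keeps the list actionable.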

Helpable, Document360, and Zendesk Guide expose this report by default. Notion and Help Scout require workarounds.

Report 2: Article feedback (was this helpful?)

A "was this helpful?" widget at the bottom of every article asks customers to rate the answer. The aggregate score reveals which articles work and which need rewriting.

What to look for:

  • Articles with feedback score under 60% → rewrite candidates.
  • Articles with high view count but low feedback score → highly searched but unhelpful, urgent rewrites.
  • Articles with high feedback score but low view count → potentially under-promoted, fix internal links.

What to do:

  • Quarterly review of articles with feedback below 60%. Rewrite or archive.
  • For articles being rewritten, read the comments customers left explaining why. The most common complaints: too vague, no concrete steps, missing screenshots.
  • Set a baseline. Knowledge base articles average 70-80% positive feedback when well-written.
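The "high views, low score" triage above is a simple filter. A sketch, assuming per-article stats with hypothetical field names (real exports differ by tool):

```python
# Hypothetical per-article stats; field names are assumptions.
articles = [
    {"title": "Cancel your subscription", "views": 4200, "helpful": 55,  "not_helpful": 60},
    {"title": "Reset your password",      "views": 3100, "helpful": 240, "not_helpful": 35},
    {"title": "Export invoices",          "views": 150,  "helpful": 12,  "not_helpful": 2},
]

def feedback_score(a):
    """Fraction of 'helpful' votes; None when an article has no votes yet."""
    votes = a["helpful"] + a["not_helpful"]
    return a["helpful"] / votes if votes else None

def rewrite_candidates(articles, score_threshold=0.60, view_threshold=1000):
    """Flag highly viewed articles scoring under the feedback threshold."""
    return [
        a["title"] for a in articles
        if (s := feedback_score(a)) is not None
        and s < score_threshold and a["views"] >= view_threshold
    ]

print(rewrite_candidates(articles))  # ['Cancel your subscription']
```

"Export invoices" scores below nothing alarming here, but even if it scored poorly it would not be flagged: at 150 views it is not where rewrite effort pays off first.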

Report 3: Deflection rate

The single most important metric for ROI: what percentage of customers who visited the knowledge base did NOT escalate to a ticket within 24 hours?

What to look for:

  • Total knowledge base visitors per period.
  • Of those, how many created a support ticket within 24 hours?
  • 1 - (ticket-creators / KB visitors) = deflection rate.
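The formula above in runnable form - a minimal sketch that assumes you can already count both populations for the same period:

```python
def deflection_rate(kb_visitors, ticket_creators):
    """Share of KB visitors who did NOT open a ticket within 24 hours."""
    if kb_visitors == 0:
        return None  # no traffic, rate undefined
    return 1 - (ticket_creators / kb_visitors)

# Example week: 2,000 KB visitors, 700 of them opened a ticket within 24h.
rate = deflection_rate(2000, 700)
print(f"{rate:.0%}")  # 65%
```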

Benchmark:

  • Under 30% deflection: knowledge base is not working. Either content gaps or poor findability.
  • 30-60% deflection: average. Most teams operate here.
  • Above 60% deflection: excellent. Indicates strong content + good search + clear escalation paths.

What to do:

  • Track weekly. Sudden drops indicate either traffic source changes or content problems.
  • Slice by article. Articles whose visitors rarely go on to open tickets are gold; articles whose visitors frequently escalate signal customer frustration.
  • AI chatbots multiply deflection. Helpable Calli, Intercom Fin, and Help Scout Beacon all add their own deflection layer on top of standard knowledge base usage.

What to ignore

Five vanity metrics that mislead more than they help:

  1. Total page views. A page can be visited 10,000 times and answer 0 questions.
  2. Average session duration. Long sessions can mean engagement OR confusion.
  3. Bounce rate. A "bounce" can mean "found the answer immediately" or "this is wrong, leaving."
  4. Most visited articles. Tells you what people search for, not what works.
  5. Total searches. Inputs, not outcomes.

These have their place in marketing analytics. They do not have a place in support analytics.

How AI changes knowledge base analytics

When you add an AI chatbot, three new metrics emerge:

AI deflection rate. Of all customer questions, how many did the AI handle without escalating to a human? 40-70% is realistic with a good knowledge base.

AI confidence distribution. What percentage of AI answers were high-confidence vs. low-confidence vs. escalated? Low-confidence answers should be rare; if they are common, your knowledge base needs more content.

AI vs human deflection comparison. Are customers more satisfied with AI answers or human answers? Helpable, Intercom, and Help Scout track this; most other tools do not.
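Splitting AI outcomes is a tally over conversation logs. A sketch with assumed outcome labels - not any specific vendor's schema:

```python
from collections import Counter

# Hypothetical per-conversation AI outcomes; the labels are assumptions.
outcomes = ["high_confidence", "high_confidence", "low_confidence",
            "escalated", "high_confidence", "escalated"]

counts = Counter(outcomes)
total = len(outcomes)
# AI "handled" a conversation if it answered without escalating to a human.
ai_handled = counts["high_confidence"] + counts["low_confidence"]

print(f"AI deflection: {ai_handled / total:.0%}")  # 67%
for label, n in counts.items():
    print(f"  {label}: {n / total:.0%}")
```

The same tally gives you both new metrics at once: the top line is AI deflection rate, and the per-label breakdown is the confidence distribution.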

Setting up a useful dashboard

A practical knowledge base dashboard has just six widgets:

  1. Zero-results queries this week (top 10)
  2. Article feedback distribution (% under 60%)
  3. Deflection rate trend (last 90 days)
  4. AI deflection rate (if applicable)
  5. Top low-feedback / high-traffic articles (rewrite candidates)
  6. New articles published this month (output tracking)

Anything beyond this is decoration. Most knowledge base tools include all six by default - if yours does not, switch.

Frequently asked questions

What is a good deflection rate for a SaaS knowledge base? 30-60% is typical. Above 60% is excellent. Below 30% suggests serious content gaps or poor findability.

How often should I review knowledge base analytics? Zero-results: weekly, 15 minutes. Feedback: monthly, 1 hour. Deflection trend: monthly, 30 minutes.

Which tools have the best built-in analytics? Helpable, Document360, and Zendesk Guide expose all three essential reports by default. Help Scout has decent analytics on Standard plan and above. Notion lacks dedicated KB analytics entirely.

Can I track knowledge base analytics with Google Analytics? For page views and search queries: yes. For deflection rate (ticket creation correlation): no - GA does not connect to your support tool. You need a knowledge base tool with native deflection tracking.

How do I measure AI deflection separately from total deflection? Helpable's dashboard separates AI-handled vs human-handled vs escalated. Intercom does this for Fin AI. Most other AI tools combine the metrics, making it harder to evaluate AI specifically.

Next step

Want a knowledge base with built-in zero-results, feedback, and deflection analytics for a flat monthly price? Try Helpable free for 7 days. $149/month flat with AI included.


Sources: McKinsey research 2024 on knowledge workers, Gartner Knowledge Management Software Reviews 2026.

Last updated: May 2026.

Ready to reduce support tickets?

Build a help center that answers questions before they become tickets. Free plan available.