Dashboards are seductive. A fresh template, a row of sparklines, and suddenly you feel data-driven. Then three weeks later the dashboard goes stale, nobody opens it, and the team is back to guessing. The problem isn’t the dashboard — it’s what came before it. Before you build one, ask the four questions every metric should answer. Any metric that can’t answer all four doesn’t belong on your dashboard.
I started using this filter after a client showed me a beautiful 14-tile dashboard that nobody used. When I asked which decisions each tile was supposed to inform, the CEO couldn’t name one for nine of them. The dashboard was decoration. If you’ve ever built — or been handed — a dashboard that nobody consults, this article is for you.

Why Most Dashboards Fail
A dashboard fails for one of three reasons: the metrics on it are wrong, the audience isn’t clear, or the action isn’t defined. Most failures combine all three. Teams add tiles because “it seemed useful,” nobody owns a tile, and when a number moves, nobody knows what to do.
The four-question filter I’ll share cuts that failure rate dramatically. In my experience, applying it honestly removes half the tiles from a typical dashboard. The remaining half becomes the dashboard you’ll actually check.
Question 1: What Decision Will Change Based on This Number?
This is the filter question. If you can’t name a decision the metric will inform, the metric is decoration. For instance, if you track “daily sessions,” ask yourself: what will I do differently if it’s 1,200 versus 1,400? If the answer is “nothing” or “I’d need more context,” the metric isn’t earning its place.
| Metric | Decision It Should Inform | Earns Its Place? |
|---|---|---|
| Weekly new revenue | Whether to keep current acquisition spend | Yes |
| Total pageviews | (none by itself) | No |
| Trial-to-paid conversion | Whether to iterate on onboarding | Yes |
| Average session duration | (rarely, without context) | Usually not |
| Cost per acquired customer | Whether channel mix needs rebalancing | Yes |
Notice the pattern. Decision-linked metrics survive; context-free metrics don’t. Therefore, before you add a tile, write the decision it enables on a sticky note next to it. If you can’t write one, cut it.
Question 2: Who Owns the Number When It Moves?
Every metric needs exactly one human owner. Not a team. Not “marketing.” A name. When the number goes sideways, that person investigates and reports back. No owner means no accountability, which means no follow-through.

This sounds obvious until you try it. For many small teams, the same person ends up owning eight metrics, which means they own none. The exercise of naming owners surfaces over-concentration and forces the team to distribute responsibility realistically.
- Marketing owns: acquisition cost, conversion rate, traffic quality
- Product owns: activation rate, feature adoption, onboarding completion
- Customer success owns: churn rate, expansion revenue, support load
- Finance owns: revenue totals, margin, runway
If two people claim the same metric, you have a definition problem — they’re probably looking at different calculations. Resolve that before moving on; otherwise the dashboard will host an ongoing turf war.
Question 3: What’s the Baseline, and What Counts as Alarming?
A number in isolation is meaningless. “We had 847 signups last week” tells you nothing without the baseline. Is that good? Bad? Normal? Without a baseline and an alarm threshold, every Monday meeting becomes a vibes-based discussion.
For each metric, document three levels: the historical average over the last 13 weeks, the threshold on the bad side that triggers an investigation, and the threshold on the good side that triggers a "why are we winning?" conversation. Note that for cost-style metrics, where lower is better, the floor and ceiling flip sides of the baseline. Yes, good news deserves investigation too. Spikes you don't understand are risks, not wins.
| Metric | Baseline (13-week avg) | Floor | Ceiling |
|---|---|---|---|
| Weekly signups | 680 | 500 | 900 |
| Trial-to-paid % | 18% | 14% | 25% |
| Cost per signup | €6.20 | €8.50 | €4.00 |
These numbers live next to each metric on the dashboard. Now when the week ends, anyone can glance and instantly classify: healthy, investigate, or celebrate. Consequently, the dashboard becomes self-service instead of requiring a weekly explanation meeting.
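The glance-and-classify step is mechanical enough to automate. Here's a minimal sketch (the function name and thresholds are illustrative; the threshold values come from the table above) that turns a weekly value into one of the three labels:

```python
def classify(value, floor, ceiling, lower_is_better=False):
    """Classify a weekly metric value against its documented thresholds.

    floor   -- the bad-side threshold that triggers an investigation
    ceiling -- the good-side threshold that triggers a "why are we
               winning?" conversation
    For cost-style metrics (lower_is_better=True) the comparisons flip,
    since a number above baseline is the alarming one.
    """
    if lower_is_better:
        if value > floor:
            return "investigate"
        if value < ceiling:
            return "celebrate"
        return "healthy"
    if value < floor:
        return "investigate"
    if value > ceiling:
        return "celebrate"
    return "healthy"

# Thresholds from the table above
print(classify(680, floor=500, ceiling=900))            # weekly signups
print(classify(9.10, floor=8.50, ceiling=4.00,
               lower_is_better=True))                   # cost per signup
```

The point isn't the code; it's that once floor and ceiling are written down, the Monday classification requires no judgment at all.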
Related: The three revenue metrics every small team should track weekly covers the specific metrics I’d apply this filter to first.
Question 4: Can You Explain How It’s Calculated in One Sentence?
If the calculation takes a paragraph, the metric is too complex for a dashboard. Complex metrics are fine for deep analysis, but a weekly glance needs definitional clarity. Otherwise people fight about numbers instead of acting on them.
Write the one-sentence definition in plain English. “Weekly new revenue = sum of first invoices paid between Monday and Sunday, excluding renewals and upgrades.” Clear. Testable. Hard to misinterpret.
- Good: “Conversion rate = paid signups ÷ trial signups, measured over same 14-day window.”
- Bad: “Conversion rate = overall performance of our funnel adjusted for quality.”
- Good: “Active customers = accounts with at least one login in the last 28 days.”
- Bad: “Active customers = engaged users based on composite engagement score.”
In other words, if the definition needs an explainer, pick a different metric. Dashboards are for monitoring, not for interpreting. Save the interpretation work for the analysis tools, where you have space to explain the math.
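A useful smoke test: a one-sentence definition should translate directly into a one-line function. These sketches use the "good" definitions above (field names are hypothetical):

```python
def conversion_rate(paid_signups: int, trial_signups: int) -> float:
    """Conversion rate = paid signups ÷ trial signups, same 14-day window."""
    return paid_signups / trial_signups if trial_signups else 0.0

def is_active(days_since_last_login: int) -> bool:
    """Active customer = at least one login in the last 28 days."""
    return days_since_last_login <= 28
```

If the function body needs branches, weights, or an "adjustment," the metric has already failed Question 4.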
A Worked Example: Auditing an Existing Dashboard
Let me walk through a real audit I ran last quarter. The team had 14 tiles, and I asked the four questions of each one. Here's a representative slice of the results:

| Original Tile | Q1: Decision? | Q2: Owner? | Q3: Baseline? | Q4: Definable? | Verdict |
|---|---|---|---|---|---|
| Weekly new revenue | Yes | Yes | Yes | Yes | Keep |
| Total pageviews | No | Unclear | No | Yes | Cut |
| Avg session time | Unclear | Marketing? | No | Yes | Cut |
| Trial-to-paid % | Yes | Yes | Yes | Yes | Keep |
| Organic impressions | No | SEO? | No | Yes | Cut |
| Customer acquisition cost | Yes | Yes | Yes | Yes | Keep |
| Social media followers | No | No | No | Yes | Cut |
The final dashboard had six tiles, down from fourteen. Team usage jumped from “maybe once a month” to “every Monday morning.” The less there is to look at, the more people look at it.
When to Break the Rules
Two exceptions are worth naming. First, context tiles — metrics that don’t trigger decisions but give interpretive color to other metrics. A “total active accounts” number might belong next to “weekly churn” to put the churn in scale. Keep these, but mark them explicitly as context, not action.
Second, stakeholder tiles — metrics your board or investors ask for that you don’t use operationally. Don’t fight the requirement; just segregate these tiles from your team’s working dashboard so they don’t dilute attention.
Applying This Without Fighting Your Team
If you inherit a bloated dashboard, don’t announce “I’m cutting this.” Instead, run the four questions in a 30-minute workshop with whoever currently uses (or ignores) the dashboard. Let the answers speak. When people can’t answer Q1 for their favorite tile, they’ll usually concede the cut themselves.
- List every metric on the current dashboard
- Walk through the four questions per metric with the relevant owner
- Cut anything that fails two or more questions
- Document baseline, floor, ceiling, and definition for survivors
- Rebuild the dashboard with survivors only
For additional context on what makes a measurement framework trustworthy, the Harvard Business Review guide to KPIs and the Nielsen Norman Group research library both offer solid grounding.
Related: How to build a simple measurement plan before your first campaign shows how to pick the right metrics upfront — ideally, before you ever need a dashboard.
Continue Learning
Explore more about choosing metrics that earn their place:
- Beyond pageviews: advanced metrics that predict business success — moving past vanity tiles.
- The three revenue metrics every small team should track weekly — a minimalist dashboard template.
- How to build a simple measurement plan before your first campaign — upstream thinking that shapes downstream dashboards.
Bottom Line
Before you build — or inherit — a dashboard, run every tile through the four questions every metric should answer: Decision? Owner? Baseline? Definition? If a metric fails any of them, cut it or fix it. Dashboards are for working, not for looking. A smaller dashboard used daily beats a beautiful dashboard used never.

