06-reference

reforge why analytics efforts fail

2026-04-03 · article · source: https://www.reforge.com/blog/why-most-analytics-efforts-fail · by Reforge (Brian Balfour)

Reforge — Why Most Analytics Efforts Fail

Summary

Most companies describe their data as “a mess,” but that’s a symptom, not the disease. The article’s mental model breaks analytics failure into four observable symptoms and five underlying root causes, then offers a framework for fixing them.

Four symptoms of broken analytics:

  1. Lack of shared language — Different teams define the same metric differently, rendering data discussions unproductive.
  2. Slow transfer of knowledge — When people switch roles/teams/companies (avg every 18 months), institutional data knowledge walks out the door.
  3. Lack of trust — “Is that really right?” becomes the default reaction to any data presented.
  4. Inability to act quickly — All three symptoms above compound into paralysis. Teams skip data entirely because using it takes too long.

Five root causes (what most teams miss):

  1. Tracking metrics vs. analyzing them — The goal isn’t to report numbers; it’s to separate what successful users do from what failed users do. This distinction fundamentally changes what you track and how.
  2. Developer mindset vs. business user mindset — Data teams build for themselves instead of their actual customer (business users).
  3. Wrong level of abstraction — Events are defined either too broadly or too specifically; great tracking balances the two. Different “eras” of implementation leave clashing abstractions in the same schema.
  4. Written-only vs. visual communication — Bad teams have no documentation. Good teams have an event dictionary. Great teams combine written documentation with visual journey maps.
  5. Data as a project vs. ongoing initiative — Treating data as a one-time project instead of a product that requires constant iteration leads to the “Data Wheel of Death.”

The journey framework for event tracking: every user action should map to Intent -> Success -> Failure events. This is the right level of abstraction. Failure events split into implicit (user disappears from the journey) and explicit (something goes wrong).
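As a rough illustration of what Intent -> Success -> Failure events might look like in tracking code, here is a minimal sketch. The journey name, event names, and the helper functions are hypothetical, not from the article or any specific analytics SDK; the explicit/implicit failure split follows the article's definition.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    """A single tracked event in a user journey (illustrative schema)."""
    name: str                 # e.g. "signup_intent", "signup_success"
    user_id: str
    properties: dict = field(default_factory=dict)
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

JOURNEY = "signup"  # hypothetical journey name

def track_intent(user_id: str, **props) -> Event:
    # Fired when the user signals intent to start the journey.
    return Event(f"{JOURNEY}_intent", user_id, props)

def track_success(user_id: str, **props) -> Event:
    # Fired when the journey completes successfully.
    return Event(f"{JOURNEY}_success", user_id, props)

def track_failure(user_id: str, reason: str, explicit: bool = True, **props) -> Event:
    # Explicit failure: something visibly went wrong (e.g. a validation error).
    # Implicit failure isn't fired directly; it's inferred later from an
    # intent event with no matching success within some time window.
    return Event(
        f"{JOURNEY}_failure",
        user_id,
        {"reason": reason, "explicit": explicit, **props},
    )
```

Keeping every journey to this same three-event shape is what gives a consistent level of abstraction: analyses can compare intent-to-success rates across journeys without per-feature event archaeology.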

Diagnostic exercise: “Decisions Made Without Data” — Each quarter, track decisions the broader team made without data. This surfaces the highest-value gaps in your analytics coverage.

Relevance to projects:

Connects to 06-reference/2026-04-03-analytics-engineering-everywhere (analytics engineering as a response to these failures), 06-reference/2026-04-03-analytics-at-a-crossroads (industry-level view of analytics maturity), 06-reference/2026-04-03-headless-bi (tooling approach to the abstraction problem), and 06-reference/2026-04-03-scaling-data-informed-driven-led (organizational maturity model).

Signals of success (from the article):

Open Questions