Concepts
Metric Systems

Goodhart's Law

Origin: Charles Goodhart, 1975

When a measure becomes a target, it ceases to be a good measure. Optimizing for an indicator corrupts it.

As soon as people know a metric is being used to judge or reward them, they adjust their behavior to maximize it — often at the expense of the underlying goal the metric was supposed to measure.


Origin

Charles Goodhart is a British economist who was working at the Bank of England. In 1975, in the context of monetary policy, he observed that when the government uses an indicator — the M3 money supply — as a policy target, that indicator loses its predictive reliability. Economic agents adapt to the measure itself.

The formulation everyone knows came from anthropologist Marilyn Strathern (1997):

“When a measure becomes a target, it ceases to be a good measure.”

What began as a technical observation in monetary economics became a universal principle about incentive systems.


The Theory

The law rests on a simple mechanism: indicators measure reality imperfectly, by proxy. As long as they remain indicators, the imprecision is acceptable. When they become targets, actors optimize directly for the proxy — without necessarily advancing the underlying reality.

The measure was meant to reflect reality. It ends up replacing it.
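The proxy mechanism can be sketched numerically. In this minimal Python illustration (the actions, values, and error terms are invented for the example), each action has a true value and a proxy score that measures it imperfectly; selecting the proxy-maximal action rewards inflated measurement error rather than the best true value.

```python
# Hypothetical actions: (true_value, measurement_error) pairs.
# The proxy observes true value plus error -- an imperfect measure.
actions = [
    (0.9, -0.2),  # best true value, proxy slightly understates it
    (0.5,  0.6),  # mediocre true value, proxy heavily inflates it
    (0.7,  0.1),
    (0.2,  0.3),
]
proxy = [v + e for v, e in actions]  # proxy score per action

chosen = max(range(len(actions)), key=lambda i: proxy[i])       # target the proxy
best = max(range(len(actions)), key=lambda i: actions[i][0])    # true optimum

print(f"proxy picks action {chosen} (true value {actions[chosen][0]})")
print(f"true optimum is action {best} (true value {actions[best][0]})")
```

As long as the proxy is merely observed, it correlates with the truth; the moment it is targeted, the selection process rewards the error term as much as the value.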

Two related laws illuminate the same phenomenon: Campbell’s Law (1976), which describes the corruption of social indicators used as decision criteria, and the McNamara Fallacy, which describes the bias of considering only what is measurable while ignoring what is not.


In Practice

In education: “Teaching to the test” — teaching only what appears on the exam, at the expense of deep understanding of the subject.

In business: “Number of lines of code written” as a productivity measure → verbose, unnecessarily complex, hard-to-maintain code. Developers optimize for the line count, not for value.

In public health: Targets for emergency room waiting times led to keeping patients in ambulances outside the doors so the clock wouldn’t start. The statistics improved; patient health did not.

In social media: Optimizing for engagement (likes, shares, comments) leads to emotionally charged, divisive content designed to trigger reactions — not to inform or build a lasting relationship with an audience.


Nuances and Limits

Goodhart’s Law does not say that measuring is useless. Metrics are essential for navigating complexity. It says that the relationship between a metric and the goal it represents degrades as soon as the metric becomes an explicit target.

The solution is not to stop measuring, but to measure without turning measurements into primary objectives. Several practices help avoid the trap: using multiple indicators (making it harder to optimize for any single one), adding friction to tracking (reducing the constant salience of the numbers), and clearly distinguishing analysis (understanding what is happening) from optimization (seeking to maximize a number).

The trap is subtle because it sets in gradually. You start by observing the numbers, and you end up writing for them.

Sources: Goodhart, C. (1975). “Problems of Monetary Management: The U.K. Experience” · Strathern, M. (1997). “‘Improving ratings’: audit in the British University system” · Wikipedia — Goodhart’s law
