In 1902, the French colonial government in Hanoi faced a bubonic plague threat. Rats thrived in the dark, predator-free sewers of the new French Quarter. The administration offered a bounty of one cent per rat. They required a severed tail as proof.
The program initially appeared successful. Daily counts exceeded ten thousand. On June 21st, residents submitted 20,112 tails.
The rat population continued to grow.
Administrators eventually identified the failure. Streets contained tail-less rats surviving in the gutters. Residents had established rat farms on the outskirts of the city. They bred rats, harvested tails, and released the breeding stock to multiply. The bounty transformed rats into a commodity. The intervention ended with an outbreak that killed nearly three hundred people.
This mechanism is universal. Interventions often invert themselves. You push a lever to fix a problem and the problem intensifies. This is the Cobra Effect.
Mechanical Models Fail
Most people carry a mechanical model of change. They observe a problem, apply pressure, and expect movement in the intended direction. Real complex systems—markets, ecosystems, software architectures—contain feedback. They consist of agents who read your rules and adjust their behavior.
Donella Meadows observed this counterintuitive behavior in her research on systems.
“People know intuitively where leverage points are. Time after time I’ve done an analysis of a company, and I’ve figured out a leverage point. Then I’ve gone to the company and discovered that there’s already a lot of attention to that point. Everyone is trying very hard to push it [in the wrong direction].”
Leverage points attract focus. Intuition often dictates the wrong direction. The Hanoi government correctly identified the leverage point: the rat population. They incorrectly defined the metric and the incentive. The system pushed back. Agents optimized for the metric (tails) without changing the underlying problem (rats).
Goodhart and Campbell
The pattern exists across economics and sociology.
Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure. Rewarding dead-rat tails optimizes for tails. The proxy and the goal diverge.
Campbell’s Law asserts that quantitative social indicators used for decision-making distort the processes they monitor. Test scores stop measuring learning. Lines of code stop measuring engineering productivity. Measurement deforms the reality it observes.
Complex systems are not clay. They are populations of decision-makers. They reorganize behavior to optimize against your rules rather than your intent. Powerful incentives trigger aggressive reorganization.
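The proxy-versus-goal divergence can be sketched as a toy simulation. Everything here — the breeding rate, the farming cost, the monthly numbers — is invented for illustration, not taken from the historical record:

```python
# Toy model of Goodhart's Law: agents are paid per tail (the proxy),
# not per reduction in the rat population (the goal).
# All constants are invented for illustration.

def simulate(months, bounty_per_tail, farm_cost_per_tail):
    wild_rats = 10_000
    tails_paid = 0
    for _ in range(months):
        if bounty_per_tail > farm_cost_per_tail:
            # Farming beats hunting: agents breed rats, harvest
            # tails, and release the breeding stock.
            tails_paid += 20_000
            wild_rats = int(wild_rats * 1.10)  # released stock multiplies
        else:
            # Honest hunting actually removes rats.
            caught = min(wild_rats, 5_000)
            tails_paid += caught
            wild_rats = int((wild_rats - caught) * 1.05)
    return tails_paid, wild_rats

tails, rats = simulate(months=12, bounty_per_tail=1.0, farm_cost_per_tail=0.2)
# The proxy (tails paid) climbs while the goal (fewer rats) regresses.
```

The point of the sketch is the branch condition: once gaming the proxy is cheaper than serving the goal, every additional unit of incentive funds the farm, not the hunt.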
Industrial Echoes
The pattern repeats in modern industry.
- Banking Fraud. Wells Fargo branch managers faced strict cross-selling quotas. Employees opened millions of fake accounts to hit targets and secure bonuses. The incentive intended to deepen relationships; it produced systemic fraud.
- Healthcare Mortality. The Hospital Readmissions Reduction Program penalized hospitals for returns within thirty days. Readmissions dropped. Mortality rose for patients with heart failure. Hospitals avoided readmitting sick patients to preserve their metrics. Patients died at home.
- Climate Degradation. The UN Clean Development Mechanism rewarded factories for destroying HFC-23, a greenhouse gas byproduct. Credits became so lucrative that factories produced more refrigerant specifically to generate HFC-23 for destruction.
The cycle is predictable. Identify a problem. Pick a proxy. Attach an incentive. Watch the system reorganize around the proxy while the underlying problem worsens.
Design for Resistance
Honest interventions are difficult and unsexy.
Assume the system will fight. Every agent inside the system will seek the cheapest path to reward. If your design is fragile to gaming, it will fail. This is the only assumption that survives contact with reality.
Target goals, not parameters. Meadows ranked parameters—tax rates, bounty prices—as the weakest leverage points. The strongest leverage lives in the system’s goals and paradigms. If the goal is counting tails, you get tails. If the goal is eliminating rat habitat, gaming becomes far harder.
Build feedback for perverse outcomes. Most modern systems are too large for absurdity to be visible on the street. You must measure whether the metric still tracks the goal. The second measurement protects you from the first.
Intervene less. Observe more. Aggressive interventions often deepen global problems. The strongest move is frequently to remove pressure rather than to increase it.
The Competence Trap
The Hanoi administrators were not incompetent. They defined a goal, selected a proxy, and built a clean incentive structure. By every internal metric, the program worked. The world outside the metric failed.
Your model of the system is not the system. Your metric of the goal is not the goal. Agents run their own models. You cannot out-design this reality. You can only stay humble enough to watch your lever invert and modest enough to abandon the program when it does.
Check the rats in your gutter. Ensure they still have their tails.