Entropy and Disorder
Why "entropy = disorder" is wrong, and what entropy actually measures.
Why "entropy = disorder" is wrong, and what entropy actually measures.
Entropy is often explained as “disorder,” but that framing is misleading and loses the real insight.
> “Entropy always increases, so the universe tends toward disorder.”
This makes entropy sound like messiness: a messy room supposedly has high entropy, a clean room low entropy. But a room isn’t a thermodynamic system in the relevant sense, and “messy” isn’t a physical quantity.
Entropy quantifies how many microscopic arrangements (microstates) produce the same macroscopic state. Boltzmann made this precise: $S = k_B \ln W$, where $W$ counts those arrangements.
Example: air molecules in a room.

- State A: every molecule crowded into one corner. Only a handful of microscopic arrangements produce this macrostate.
- State B: molecules spread roughly evenly throughout the room. An astronomically larger number of arrangements produce it.

The second state is more probable, not more “messy.” Entropy is a statement about probability distributions, not aesthetics.
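A toy count makes this concrete. The sketch below is a minimal Python illustration under an assumed toy model: 100 molecules, each independently in the left or right half of the room, with the macrostate defined as the number of molecules on the left.

```python
from math import comb, log

N = 100  # molecules in the toy model (an assumed, illustrative number)

# A macrostate is "k molecules in the left half"; a microstate is the full
# left/right assignment of every individual molecule.
for k in [0, 10, 25, 50]:
    W = comb(N, k)   # microstates compatible with this macrostate
    S = log(W)       # entropy in units of k_B: S = ln W
    print(f"k = {k:3d}   W = {float(W):.3e}   S/k_B = {S:6.2f}")

# k = 0 ("all in one corner") has exactly one microstate; the even split
# k = 50 has ~1e29. The spread-out state wins on sheer count, not tidiness.
```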
In information theory, entropy measures the uncertainty in a random variable $X$:

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$
A fair coin flip has $H = 1$ bit. A loaded coin (95% heads) has $H \approx 0.29$ bits. You already know roughly what you’ll get, so there’s less information in the outcome.
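Those numbers are easy to check. A minimal Python sketch (the helper name `shannon_entropy` is just for illustration):

```python
from math import log2

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin   -> 1.0 bit
print(shannon_entropy([0.95, 0.05]))  # loaded coin -> ~0.286 bits
```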
The correct framing changes how we think about several domains:
| Domain | Wrong framing | Correct framing |
|---|---|---|
| Life | Organisms fight disorder | Life creates local order by exporting entropy to surroundings |
| Computation | Bits just get processed | Every irreversible bit operation increases thermodynamic entropy (Landauer’s principle; see the sketch after this table) |
| Design | Systems get messy over time | Systems drift toward higher-probability states — which may or may not look messy |
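The Landauer row deserves a number. A minimal sketch of the bound (erasing one bit dissipates at least $k_B T \ln 2$ of heat), assuming room temperature at 300 K:

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

e_min = k_B * T * log(2)                      # minimum heat per erased bit, joules
print(f"{e_min:.2e} J per erased bit")        # ~2.87e-21 J
print(f"{e_min * 8e9:.2e} J per erased GB")   # ~2.3e-11 J at the theoretical floor
```

Real hardware dissipates many orders of magnitude more than this floor; the point is only that the floor is not zero.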
The second law of thermodynamics says the total entropy of an isolated system never decreases. The key word is isolated. Open systems (like a living cell, or a computer cooling fan) can decrease their local entropy at the cost of increasing entropy elsewhere.
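A refrigerator is the standard bookkeeping example. The sketch below uses assumed example values (275 K inside, 300 K kitchen, 100 J of heat extracted, 12 J of electrical work) and treats each side as a constant-temperature reservoir:

```python
# Assumed example values for illustration
T_cold, T_hot = 275.0, 300.0   # inside the fridge / the kitchen, in kelvin
Q_cold = 100.0                 # heat pulled out of the cold side, joules
W = 12.0                       # electrical work spent (above the Carnot minimum of ~9.1 J)

Q_hot = Q_cold + W             # heat dumped into the kitchen

dS_cold = -Q_cold / T_cold     # entropy of the cold side drops: local order
dS_hot = Q_hot / T_hot         # entropy of the kitchen rises: entropy exported
dS_total = dS_cold + dS_hot

print(f"cold side: {dS_cold:+.4f} J/K")   # about -0.364 J/K
print(f"hot side:  {dS_hot:+.4f} J/K")    # about +0.373 J/K
print(f"total:     {dS_total:+.4f} J/K")  # positive, as the second law requires
```

The fridge’s interior gets more ordered, but only because more than enough entropy is dumped into the kitchen.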
This is why life is possible. It’s not a violation of thermodynamics — it’s thermodynamics doing exactly what it says.