EVERGREEN NOTE · information-theory · physics · philosophy

Entropy and Disorder

Why "entropy = disorder" is wrong, and what entropy actually measures.

Entropy is often explained as “disorder,” but that framing is misleading and loses the real insight.

The Common Misunderstanding

“Entropy always increases, so the universe tends toward disorder.”

This makes entropy sound like messiness. A messy room has high entropy; a clean room has low entropy. But a room isn’t a thermodynamic system in the relevant sense, and “messy” isn’t a physical quantity.

What Entropy Actually Measures

Entropy quantifies how many microscopic arrangements produce the same macroscopic state.
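
In statistical mechanics this count has a name: if $\Omega$ is the number of microstates consistent with a given macrostate, Boltzmann's formula gives the entropy as

$$S = k_B \ln \Omega$$

where $k_B$ is Boltzmann's constant. More ways to realize a macrostate means higher entropy; no notion of messiness appears anywhere.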

Example: air molecules in a room.

- State A: all molecules crowded into one corner. Only a tiny fraction of possible molecular arrangements look like this.
- State B: molecules spread roughly evenly throughout the room. Vastly more arrangements look like this.

The second state is more probable, not more “messy.” Entropy is a statement about probability distributions, not aesthetics.
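
To make the counting concrete, here is a minimal sketch of a toy model: $N$ labeled molecules, each independently in the left or right half of the room. The choice $N = 100$ is purely illustrative; a real room holds on the order of $10^{27}$ molecules, which only makes the imbalance more extreme.

```python
# Toy model: N labeled molecules, each independently in the left or
# right half of a room. The macrostate is "k molecules on the left";
# its number of microstates is the binomial coefficient C(N, k).
from math import comb

N = 100  # illustrative; a real room holds ~10^27 molecules

corner = comb(N, 0)       # all molecules on one side: exactly 1 microstate
spread = comb(N, N // 2)  # 50/50 split: ~1.01e29 microstates

print(f"all on one side: {corner} microstate")
print(f"even split:      {spread:.2e} microstates")
# Even at N = 100, the even split outnumbers the corner state by ~10^29.
```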

Shannon’s Formulation

In information theory, entropy measures uncertainty in a random variable $X$:

$$H(X) = -\sum_{i} p_i \log_2 p_i \quad \text{(bits)}$$

A fair coin flip has $H = 1$ bit. A loaded coin (95% heads) has $H \approx 0.29$ bits. You already know roughly what you'll get, so there's less information in the outcome.
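
As a quick check on those numbers, here is a minimal sketch in Python (the function name is mine, not from any particular library):

```python
# Shannon entropy of a discrete distribution, in bits:
# H(X) = -sum_i p_i * log2(p_i)
from math import log2

def shannon_entropy(probs):
    """Entropy in bits; p = 0 terms contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin        -> 1.0
print(shannon_entropy([0.95, 0.05]))  # 95% loaded coin  -> ~0.286
```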

Why This Matters

The correct framing changes how we think about several domains:

| Domain | Wrong framing | Correct framing |
| --- | --- | --- |
| Life | Organisms fight disorder | Life creates local order by exporting entropy to its surroundings |
| Computation | Bits just get processed | Every irreversible bit operation increases thermodynamic entropy (Landauer's principle; see the sketch below) |
| Design | Systems get messy over time | Systems drift toward higher-probability states, which may or may not look messy |
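
The Landauer row comes with a concrete number attached. A minimal sketch of the arithmetic, assuming room temperature (300 K):

```python
# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, kelvin

e_min = k_B * T * log(2)
print(f"minimum heat per erased bit at {T:.0f} K: {e_min:.2e} J")
# -> roughly 2.87e-21 J
```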

The Second Law, Correctly

The second law of thermodynamics says the total entropy of an isolated system never decreases. The key word is isolated. Open systems (like a living cell, or a computer dumping heat through its cooling fan) can decrease their local entropy at the cost of increasing entropy elsewhere.
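
In symbols, the second law constrains only the total. For a system exchanging heat with its environment,

$$\Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{environment}} \ge 0,$$

so $\Delta S_{\text{system}} < 0$ is perfectly allowed whenever the environment's entropy rises by at least as much.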

This is why life is possible. It’s not a violation of thermodynamics — it’s thermodynamics doing exactly what it says.