What Is Entropy?
Core notions of entropy
- Multiple commenters align on entropy as “information you don’t have”: the logarithm of the number of microstates compatible with what you know, or the expected information content (average surprise) of a random variable (see the sketch after this list).
- Several emphasize that “entropy = disorder” is misleading; “order” is subjective and depends on which macroscopic properties you care about.
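A minimal sketch of the “average surprise” reading, in Python (function names and numbers are illustrative, not taken from the thread):

```python
import math

def shannon_entropy(probs, base=2):
    """Expected surprise, -sum(p * log p), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair 8-sided die: every outcome is equally likely, so the entropy equals
# log2(8) = 3 bits -- the log of the number of microstates compatible with
# what you know ("it's a fair 8-sided die").
print(shannon_entropy([1/8] * 8))    # ~3.0 bits

# A heavily biased coin carries less average surprise than a fair coin's 1 bit.
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
```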
Subjective vs objective, probability, and observers
- Big subthread on whether entropy is a property of the system or the observer.
- Bayesian-leaning view: entropy quantifies an agent’s uncertainty given their model and priors; two people with different knowledge about a loaded die, coin toss, or PRNG seed assign different entropies.
- Frequentist/physical view: the “true” distribution (e.g., of a loaded die) is objective; differing entropies just reflect wrong assumptions (see the sketch after this list).
- Resolution attempts:
  - Probability distributions can be used both for subjective beliefs and for objective mechanisms (e.g., an LLM’s next-token logits).
  - Entropy is then a property of the chosen macrostate description and probability model; macrostate choice itself is partly subjective.
- Related debates touch on omniscient observers, continuous vs discrete systems, and differential entropy’s pathologies (unit dependence, possible negativity).
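One way to make the subjective/objective tension concrete (a sketch under assumed numbers, not an example from the thread): compute the entropy of the same loaded die under two states of knowledge, and note the differential-entropy pathology mentioned above.

```python
import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The same physical die, two states of knowledge:
uniform_belief = [1/6] * 6                       # observer who knows nothing about the loading
true_loading   = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # observer who knows the "true" frequencies

print(entropy_bits(uniform_belief))  # ~2.585 bits (maximum for six outcomes)
print(entropy_bits(true_loading))    # ~2.161 bits -- less uncertainty, same die

# Differential entropy of a uniform distribution on [0, a] is log2(a),
# which is negative for a < 1 and changes if you rescale the units:
a = 0.5
print(math.log2(a))                  # -1.0 "bits"
```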
Thermodynamics and statistical mechanics
- Several note the article, like many popular pieces, emphasizes Shannon-style entropy and downplays the original thermodynamic definition (reversible heat exchanged divided by temperature, the third law, measurable in kJ/K).
- Others connect back to standard stat mech: entropy from probabilities over microstates, partition functions, and the role of ergodicity and mixing; Pauli exclusion and Bose/Fermi statistics change microstate counting (the standard formulas are recapped after this list).
- Some argue thermodynamics is a macroscopic phenomenology on which statistical mechanics is built, not the other way around.
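For reference, the textbook forms these bullets point at (standard definitions, not quotes from commenters): the Clausius definition, the Boltzmann microstate count, and the Gibbs/canonical form via the partition function.

```latex
% Clausius (thermodynamic): entropy change along a reversible path
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann (microcanonical): \Omega = number of microstates in the macrostate
S = k_B \ln \Omega

% Gibbs (canonical): probabilities over microstates from the partition function Z
S = -k_B \sum_i p_i \ln p_i,
\qquad
p_i = \frac{e^{-E_i / k_B T}}{Z},
\qquad
Z = \sum_i e^{-E_i / k_B T}
```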
Macrostates, “order,” and context-dependence
- Examples (password strings, cream-in-coffee, melting ice) illustrate that macrostates are defined by what distinctions an observer cares about; what looks “random” to one might be “my password” to another.
- Entropy is thus tied to how we coarse-grain reality into macrostates; different coarse-grainings yield different entropies even for the same underlying microstate, as the sketch below illustrates.
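A toy illustration of coarse-graining (hypothetical names, Python): the same 8-bit microstate gets different entropies depending on which macrostate description you choose.

```python
import math
from itertools import product

def macrostate_entropy(microstate, coarse_grain):
    """Boltzmann-style entropy: log2 of the number of microstates that share
    this microstate's macrostate under the given coarse-graining."""
    label = coarse_grain(microstate)
    compatible = sum(1 for s in product("01", repeat=len(microstate))
                     if coarse_grain("".join(s)) == label)
    return math.log2(compatible)

bits = "10110100"  # one particular microstate

# Coarse-graining A: only the number of 1s matters (like "how much cream is mixed in").
by_count = lambda s: s.count("1")

# Coarse-graining B: the exact string matters (like "is this my password?").
exact = lambda s: s

print(macrostate_entropy(bits, by_count))  # log2(C(8,4)) = log2(70) ~ 6.13 bits
print(macrostate_entropy(bits, exact))     # log2(1) = 0 bits -- fully specified
```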
Meta and critiques
- Complaints about the overuse of “entropy” as a buzzword in software (“software entropy”) and elsewhere, which blurs its rigorous physical and information-theoretic meanings.
- Others defend broad use but stress the need to distinguish genuine generalizations from loose metaphor, and to recognize that many “entropies” share the same -∑p log p form but live in different conceptual frameworks.