What is entropy? A measure of just how little we know
Nature of Entropy: Property of System vs Knowledge
- Strong debate over whether entropy is an objective property of a physical system or a measure of an observer’s ignorance.
- One side: thermodynamic entropy is fixed by the system’s actual microstates and thermodynamic variables; experiments (calorimetry, equations of state) give consistent values independent of what anyone “knows.”
- Other side: entropy is always defined relative to a chosen description (macro-variables, coarse graining); thus it is a property of “system + description,” and in that sense observer- or model-dependent.
Thermodynamics vs Information Theory
- Information theory and Bayesian/statistical perspectives treat entropy explicitly as “missing information.”
- Some argue this view has deep roots (Jaynes, MaxEnt, “anthropomorphic” entropy) and is fruitful, including in quantum statistical mechanics.
- Others warn that importing information-theoretic intuition into thermodynamics can be misleading or unphysical if taken too literally.
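To make the "missing information" reading concrete, here is a minimal Shannon-entropy sketch (the function name and examples are my own illustration, not code from the article or thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Read information-theoretically, H is the average number of
    yes/no questions needed to pin down the outcome, i.e. the
    observer's missing information about the system.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximal ignorance over two outcomes, 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin: the observer is less ignorant, so H is smaller.
print(shannon_entropy([0.9, 0.1]))
```

On the Jaynes/MaxEnt view, the Gibbs entropy of statistical mechanics is this same quantity (up to Boltzmann's constant and a change of logarithm base) evaluated over microstate probabilities; that identification is exactly what the critics in the thread say can mislead if taken too literally.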
Macrostates, Coarse-Graining, and Subjectivity
- Entropy depends on how microstates are grouped into macrostates (e.g., pressure-only vs partial pressures of gas components), so different choices yield different entropies.
- Disagreement: whether this is akin to a coordinate/unit change (trivial, no physical difference) or genuinely different physical descriptions predicting different experimental outcomes.
- Several examples with gas mixtures, distinguishable vs indistinguishable particles, and the Gibbs paradox are used to argue both sides.
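The coarse-graining point can be made quantitative with the textbook ideal-gas mixing entropy. In this sketch (my own illustration, assuming ideal gases and using the standard per-particle formula), whether the observer's description distinguishes the two species changes the computed entropy, which is the crux of the Gibbs-paradox examples in the thread:

```python
import math

def mixing_entropy_per_particle(x, distinguish=True):
    """Entropy of mixing, in units of k_B per particle, for two
    ideal-gas components with mole fractions x and 1 - x.

    If the chosen macrostates do not distinguish the two species
    (a coarser description), no mixing entropy appears at all,
    which is the usual resolution of the Gibbs paradox.
    """
    if not distinguish or x in (0.0, 1.0):
        return 0.0
    return -(x * math.log(x) + (1 - x) * math.log(1 - x))

# The same physical process under two descriptions:
fine   = mixing_entropy_per_particle(0.5, distinguish=True)   # ln 2
coarse = mixing_entropy_per_particle(0.5, distinguish=False)  # 0.0
print(fine, coarse)
```

One side of the thread reads this as proof that entropy is description-relative; the other replies that the two descriptions license different experiments (e.g. semipermeable membranes), so the difference is physical, not merely notational.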
Probability, Quantum Mechanics, and Ignorance
- One camp: probabilities reflect only lack of knowledge; calling them “properties of systems” is a mind projection error.
- Counterpoint: quantum measurement outcomes are fundamentally probabilistic (per the Born rule); even a maximally informed observer can only assign probabilities, so some probabilities are taken as objective features of physical setups rather than mere ignorance.
Temperature, Second Law, and Work Extraction
- Temperature and entropy are tightly linked (in statistical mechanics, 1/T = ∂S/∂E at fixed volume and particle number); if entropy is observer-relative, temperature may be as well.
- Others insist that everyday thermodynamic phenomena (ice melting, heat engines, no perpetual motion) are observer-independent, constraining any subjective interpretation.
- Work extractable from a system can depend on what macro-variables you can control and measure, reinforcing a “capabilities-relative” notion of entropy.
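The "capabilities-relative" claim has a standard quantitative form: the maximum isothermal work extractable by un-mixing two species is W = N k_B T ΔS_mix, available only to an observer who can distinguish and act on the species (say, with semipermeable membranes). A hedged sketch, with names and numbers of my own choosing:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_mixing_work(n_particles, temperature, x, can_distinguish):
    """Maximum isothermal work (J) from un-mixing two ideal-gas
    species at mole fraction x: W = N * k_B * T * s_mix.

    An observer who cannot distinguish (or act on) the two species
    assigns s_mix = 0 and can extract no work: the same gas, but a
    different entropy relative to that observer's capabilities.
    """
    if not can_distinguish or x in (0.0, 1.0):
        return 0.0
    s_mix = -(x * math.log(x) + (1 - x) * math.log(1 - x))
    return n_particles * K_B * temperature * s_mix

# Roughly a sixth of a mole at room temperature, 50/50 mixture:
print(max_mixing_work(1e23, 300.0, 0.5, can_distinguish=True))
print(max_mixing_work(1e23, 300.0, 0.5, can_distinguish=False))
```

Crucially, no observer can extract more than the fine-grained bound, so the observer-relative reading still forbids perpetual motion, which is how the two camps in the thread partially reconcile.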
Article Style, Interactives, and Side Topics
- Many praise the explorable, interactive format; some mention “explorable explanations” and related terminology.
- Some criticize the article as muddled or lifestyle-ish compared to more concise treatments.
- Minor side threads touch on sci-fi references, cosmology (heat death, Big Bang, ToE, chaos), and implementation details of the interactives (Svelte, iframes).