---
name: Entropy
slug: entropy
type: thermodynamic principle (also: cosmic mood)
status: running
version: ∞.decay
released: "13,800,000,000 BCE"
maintainer: universe@localhost
dependencies:
  - time
  - energy
  - probability
  - closed systems
license: Unavoidable Public Domain
tags:
  - thermodynamics
  - information theory
  - physics
  - existential
  - second law
  - decay
  - disorder
---
The universe's one-way preference for spreading things out, formalized into a law so no one could argue with it.
Every closed system has a finite number of ordered states and a vastly larger number of disordered ones. Probability does the rest. At no point does entropy "try" to do anything. It simply waits while everything else fails to stay organized.
The formal accounting unit is the joule per kelvin. The informal accounting unit is your apartment after two weeks of ignoring it.
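The counting argument above can be made concrete with a short simulation (a toy sketch: the 8-element system, the trial count, and "ordered = sorted" are all illustrative assumptions, not physics):

```python
import random

# A "system" of 8 distinguishable parts. Exactly one arrangement
# counts as "ordered" (sorted); the other 8! - 1 = 40319 count
# as "disordered". Probability does the rest.
system = list(range(8))

trials = 100_000
ordered = 0
for _ in range(trials):
    state = system[:]
    random.shuffle(state)        # let chance pick a microstate
    if state == sorted(state):   # did we land in the one ordered state?
        ordered += 1

# Expected ordered fraction is 1/40320, about 0.0000248.
print(f"ordered states seen: {ordered} / {trials}")
```

At no point does the shuffle "try" to make a mess; the messy outcomes simply outnumber the tidy one by four orders of magnitude.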
Entropy increases. This is the Second Law of Thermodynamics, the only law in physics that seems to know which direction time is pointing. Everything else is reversible in the equations. This is not.
ENTROPY_001 Closed system assumed open. Recalculate.
ENTROPY_002 Order restored locally; disorder exported and unaccounted.
ENTROPY_003 Perpetual motion machine submitted for patent review. Rejected.
ENTROPY_404 Gradient not found. Equilibrium reached. Process halted.
ENTROPY_MAX Heat death. No further errors will be logged.
Does entropy mean everything falls apart? Eventually. On a long enough timeline, "eventually" is doing a lot of work.
Can entropy be reversed? In a closed system, no. Locally, yes, but only by exporting even more disorder to the surroundings. You just moved the problem.
Is death entropy? It is entropy achieving local consensus.
Why does my desk get messy but never spontaneously tidy? There is one arrangement you call "tidy." There are approximately 10 to the 68th power arrangements you would call "messy." Probability is not sentimental.
∞.decay No changes. No changes are possible that would reduce this version.
1865 Clausius formalizes entropy and names it, from the Greek for "transformation."
1877 Boltzmann links entropy to the probability of microstates. S = k log W is later carved on his tombstone.
1948 Shannon imports entropy into information theory. Physics and communication theory have not recovered from the awkwardness since.
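Boltzmann's relation from the timeline above, S = k log W (with log the natural logarithm), can be evaluated directly (a minimal sketch; the function name and the example W are illustrative choices, with W = 10^68 borrowed from the messy-desk FAQ entry):

```python
import math

# Boltzmann's constant, in the formal accounting unit: joules per kelvin.
# This is the exact value fixed by the 2019 SI redefinition.
k = 1.380649e-23  # J/K

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k * ln(W) for a system with W microstates, in J/K."""
    return k * math.log(microstates)

# One ordered arrangement: W = 1, so S = 0. Nowhere to go but up.
print(boltzmann_entropy(1))       # 0.0

# The ~10^68 messy-desk arrangements from the FAQ:
print(boltzmann_entropy(10**68))  # ≈ 2.16e-21 J/K
```

The numbers are tiny because k is tiny; the one-way direction is the point, not the magnitude.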