Here Cosma Shalizi argues that in the Bayesian approach to probability theory, entropy would decrease as a function of time, contrary to what's observed. He concludes: "Avoiding this unphysical conclusion requires rejecting the ordinary equations of motion, or practicing an incoherent form of statistical inference, or rejecting the identification of uncertainty and thermodynamic entropy."

His argument uses more jargon than strictly necessary, which may intimidate potential critics, so let me summarize it.

SHORT STATEMENT: Entropy is a measure of ignorance. Suppose we start off with some ignorance about a situation and then occasionally make measurements as time passes. The past completely determines the future, so our ignorance never increases. When we make measurements, our ignorance decreases. So, entropy drops.

MATHEMATICAL STATEMENT: Suppose we describe a situation using a probability distribution on a measure space X. The entropy of this probability distribution says how ignorant we are: the more smeared-out the distribution, the higher its entropy. Suppose that as time passes, two kinds of processes occur. 1) As time passes without measurements being made, the probability distribution changes via a measure-preserving function f: X -> X. This does not change its entropy. 2) When a measurement is made, we use Bayes' rule to update our probability distribution. This decreases its entropy, at least on average.
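Here is a minimal sketch of both processes in Python, for the finite case (my own illustration, not Shalizi's: on a finite set with counting measure, a measure-preserving map is just a permutation, and a measurement is observing which cell of a partition the state lies in):

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A prior distribution over 4 states.
p = [0.4, 0.3, 0.2, 0.1]

# 1) Measure-preserving evolution: a permutation just relabels states,
#    so the entropy is unchanged.
perm = [2, 0, 3, 1]                  # f sends state i to state perm[i]
p_evolved = [0.0] * len(p)
for i, q in enumerate(p):
    p_evolved[perm[i]] = q
assert abs(entropy(p) - entropy(p_evolved)) < 1e-12

# 2) Measurement: observe which cell of a partition the state is in,
#    then condition via Bayes' rule. Average the posterior entropies
#    over the possible outcomes, weighted by their probabilities.
partition = [{0, 1}, {2, 3}]
expected_posterior_H = 0.0
for cell in partition:
    prob_cell = sum(p[i] for i in cell)
    posterior = [p[i] / prob_cell for i in cell]
    expected_posterior_H += prob_cell * entropy(posterior)

# The expected entropy after conditioning never exceeds the prior
# entropy: this is the inequality H(X|Y) <= H(X).
assert expected_posterior_H <= entropy(p) + 1e-12
print(entropy(p), expected_posterior_H)
```

Note that a single unlucky measurement outcome can raise the entropy; the inequality H(X|Y) <= H(X) only guarantees a decrease on average over outcomes, which is all the argument needs.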

You may take my word for this: the mathematical statement is correct. The question is thus whether he's drawing the correct conclusions from the math! I have my own opinions, but I prefer to let you mull it over a bit.

Thanks to +Alexander Kruel for pointing this out!
