Here Cosma Shalizi argues that in the Bayesian approach to probability theory, entropy would *decrease* as a function of time, contrary to what's observed. He concludes: "Avoiding this unphysical conclusion requires rejecting the ordinary equations of motion, or practicing an incoherent form of statistical inference, or rejecting the identification of uncertainty and thermodynamic entropy."

His argument uses more jargon than strictly necessary, which may intimidate potential critics, so let me summarize it.

SHORT STATEMENT: Entropy is a measure of ignorance. Suppose we start off with some ignorance about a situation and then occasionally make measurements as time passes. The past completely determines the future, so our ignorance never increases. When we make measurements, our ignorance decreases. So, entropy drops.

MATHEMATICAL STATEMENT: Suppose we describe a situation using a probability distribution on a measure space X. The entropy of this probability distribution says how ignorant we are: the more smeared-out the distribution, the higher its entropy. Suppose that as time passes, two kinds of processes occur. 1) As time passes without measurements being made, the probability distribution changes via a measure-preserving function f: X -> X. This does not change its entropy. 2) When a measurement is made, we use Bayes' rule to update our probability distribution. This decreases its entropy.
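Both processes can be seen in a toy calculation. The sketch below uses a made-up six-state system, a made-up permutation for the dynamics, and a made-up detector likelihood; on a finite set, a measure-preserving map is just a permutation of the states, and "decreases its entropy" holds in expectation over measurement outcomes (a single outcome can raise the entropy, but the average posterior entropy never exceeds the prior entropy):

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

def bayes_update(p, likelihood):
    """Posterior after observing data with the given likelihood,
    plus the marginal probability of that observation."""
    unnorm = [l * q for l, q in zip(likelihood, p)]
    z = sum(unnorm)
    return [u / z for u in unnorm], z

# A smeared-out prior over six states: our initial ignorance.
prior = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1]

# 1) Time evolution without measurement: a measure-preserving map on a
#    finite set is a permutation.  It shuffles probability around but
#    leaves the entropy exactly unchanged.
permutation = [3, 0, 5, 1, 4, 2]
evolved = [prior[i] for i in permutation]

# 2) A measurement: an (invented) detector that reports "low" when the
#    state is in {0, 1, 2} and "high" otherwise, with 90% reliability.
likelihood_low = [0.9, 0.9, 0.9, 0.1, 0.1, 0.1]
likelihood_high = [1 - l for l in likelihood_low]

posterior_low, p_low = bayes_update(evolved, likelihood_low)
posterior_high, p_high = bayes_update(evolved, likelihood_high)

# Averaging over the two possible outcomes: H(X|Y) <= H(X).
expected_posterior_entropy = (p_low * entropy(posterior_low)
                              + p_high * entropy(posterior_high))

print(f"entropy before evolution:   {entropy(prior):.3f} bits")
print(f"entropy after evolution:    {entropy(evolved):.3f} bits")
print(f"expected posterior entropy: {expected_posterior_entropy:.3f} bits")
```

Running this shows the first two entropies agree and the third is strictly smaller, matching steps 1) and 2) above.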

You may take my word for this: the mathematical statement is correct. The question is thus whether he's drawing the correct conclusions from the math! I have my own opinions - but I prefer to let you mull on it a bit.

Thanks to +Alexander Kruel for pointing this out!

#physics



- +Dmitri Manin Nice! I like it! May 23, 2012
- +Rahul Siddharthan Right, but we can't say for certain that someday technology might find a way to measure these (as long as there is nothing that theoretically prevents it). So my whole point is simply that "reality" shouldn't depend on our technological capabilities since, if it did, there are paradoxes that arise. May 23, 2012
- +Ian Durham again we are back to the question: is entropy "reality"? If we assume that it is knowledge, I don't think any paradoxes arise. I haven't seen any mentioned so far. As Jaynes points out, even the Gibbs paradox is not a paradox if you think of entropy in terms of specifying a macroscopic state appropriately for your knowledge of the system. May 23, 2012
- +Rahul Siddharthan Did you read my blog post by any chance? The paradox is that, since we can set entropy equal to things that are generally interpreted as being states of reality, you *could* interpret that changing our knowledge can change reality. That, then, leads to paradoxes (e.g. gravity didn't exist until Newton discovered it). May 23, 2012
- I hadn't (travelling and on slow 2G link, posting these from my mobile). Still haven't read carefully. Two quick reactions. 1: what's dV? 2. TdS = pdV is an infinitesimal equation and you need to integrate it over a path. Ditto for pdV. To me it seems, in your setup, both observers would agree it tells us nothing. May 23, 2012
- Ps - also you need to bring in chemical potentials, at least for the observer who thinks they are different gases. My guess is, for that observer the entropy increase will then work out. For the other, zero = zero. May 23, 2012
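The Gibbs-paradox point raised in this thread can be made concrete with a short calculation. Assume (for illustration only) one mole of gas on each side of a partition in a box, at equal temperature and pressure. For an observer who can distinguish the two gases, removing the partition lets each gas double its volume, giving the textbook entropy of mixing; for an observer who treats them as identical, nothing macroscopic changes:

```python
from math import log

R = 8.314  # molar gas constant, J/(mol K)
n = 1.0    # moles of gas on each side of the partition (assumed setup)

# Observer A distinguishes the gases: each expands from V to 2V,
# contributing n*R*ln(2) per gas.
delta_S_distinguishable = 2 * n * R * log(2)

# Observer B treats the gases as identical: removing the partition
# changes no macroscopic variable, so the entropy change is zero.
delta_S_identical = 0.0

print(f"observer A: dS = {delta_S_distinguishable:.2f} J/K")
print(f"observer B: dS = {delta_S_identical:.2f} J/K")
```

Both observers make correct predictions for any experiment they can actually perform, which is Jaynes' point that entropy is relative to how finely the macrostate is specified.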
