Solved – Entropy-based refutation of Shalizi’s Bayesian backward arrow of time paradox

In this paper, the talented researcher Cosma Shalizi argues that fully accepting a subjective Bayesian view forces one to accept an unphysical result: that the arrow of time (given by the flow of entropy) should actually run backwards. This is mainly an attempt to argue against the maximum entropy / fully subjective Bayesian view put forward and popularized by E.T. Jaynes.

Over at LessWrong, many contributors are very interested in Bayesian probability theory, and in the subjective Bayesian approach as a basis for formal decision theories and a stepping stone toward strong A.I. Eliezer Yudkowsky is a frequent contributor there, and I was recently reading this post when I came across this comment (several other good comments come shortly after it on the original post's page).

Can anyone comment on the validity of Yudkowsky's rebuttal of Shalizi? Briefly, Yudkowsky's argument is that the physical mechanism by which a reasoning agent updates its beliefs requires work, and hence has a thermodynamic cost that Shalizi is sweeping under the rug. In another comment, Yudkowsky defends this, saying:

"If you take the perspective of a logically omniscient perfect observer outside the system, the notion of "entropy" is pretty much meaningless, as is "probability" – you never have to use statistical thermodynamics to model anything, you just use the deterministic precise wave equation."

Can any probabilists or statistical mechanics experts comment on this? I don't care much about arguments from authority regarding either Shalizi's or Yudkowsky's status, but I would really like to see a summary of the ways that Yudkowsky's three points offer criticism of Shalizi's article.

To conform to FAQ guidelines and make this a concretely answerable question, please note that I am asking for a specific, itemized response that takes Yudkowsky's three-step argument and indicates where in the Shalizi article those three steps refute assumptions and/or derivations, or, conversely, where in Shalizi's paper Yudkowsky's arguments are already addressed.

I've often heard the Shalizi article touted as ironclad proof that full-blown subjective Bayesianism can't be defended… but after reading the article a few times, it looks to me like a toy argument that could never apply to an observer interacting with whatever is being observed (i.e. all of actual physics). But Shalizi is a great researcher, so I would welcome second opinions; it's highly likely that I don't understand important chunks of this debate.

In short: 1:0 for Yudkowsky.

Cosma Shalizi considers a probability distribution subjected to some measurements, with the probabilities updated accordingly (it is not important here whether the update is Bayesian inference or anything else).

Not at all surprisingly, the entropy of the probability distribution decreases.
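This part is easy to check numerically. Below is a minimal sketch (the prior and likelihood numbers are made up for illustration, not taken from Shalizi's paper) showing a measurement update sharpening a distribution:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical system with 4 states and a uniform prior.
prior = np.array([0.25, 0.25, 0.25, 0.25])

# Made-up likelihood of an observed measurement outcome under each state.
likelihood = np.array([0.9, 0.5, 0.1, 0.1])

# Bayes' rule: posterior proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

print(entropy_bits(prior))      # 2.0 bits
print(entropy_bits(posterior))  # about 1.49 bits: the update lowered the entropy
```

(Strictly speaking, some individual outcomes can raise the entropy; what holds in general is that the expected posterior entropy never exceeds the prior entropy.)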

However, he draws the wrong conclusion that this says something about the arrow of time:

"These assumptions reverse the arrow of time, i.e., they make entropy non-increasing."

As was pointed out in the comments, what matters to thermodynamics is the entropy of a closed system. That is, according to the second law of thermodynamics, the entropy of a closed system cannot decrease. It says nothing about the entropy of a subsystem (or an open system); otherwise you couldn't use your fridge.

And once we measure something (i.e. interact and gather information), it is not a closed system anymore. Either we cannot use the second law, or we need to consider the closed system made of the measured system and the observer (i.e. ourselves).

In particular, when we measure the exact state of a particle (where before we knew only its distribution), we indeed lower its entropy. However, to store that information we need to increase our own entropy by at least the same amount (in practice there is a huge overhead).
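A toy model makes the bookkeeping explicit: a fair coin measured by a one-bit memory, with the measurement modeled as a reversible copy. The entropy of the coin given the observer's record drops to zero, but the joint entropy of coin plus memory, which is what the second law constrains, does not decrease. A minimal sketch:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of any array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution over (coin, memory); rows = coin 0/1, cols = memory 0/1.
# Before measurement: fair coin, memory blank (fixed at 0).
before = np.array([[0.5, 0.0],
                   [0.5, 0.0]])

# Measurement copies the coin bit into the blank memory (a reversible map).
after = np.array([[0.5, 0.0],
                  [0.0, 0.5]])

h_joint_before = entropy_bits(before)          # 1 bit
h_joint_after = entropy_bits(after)            # still 1 bit
h_memory = entropy_bits(after.sum(axis=0))     # 1 bit
h_coin_given_memory = h_joint_after - h_memory # 0 bits

print(h_joint_before, h_joint_after, h_coin_given_memory)
```

The uncertainty about the coin has not been destroyed; it has moved into the memory (and into the correlations between the two).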

So Eliezer Yudkowsky makes a good point:

1) Measurements use work (or at least erasure in preparation for the next measurement uses work).

Actually, the remark about work is not the most important point here. While thermodynamics is about relating (or trading) entropy to energy, you can get around that, i.e. we don't need to resort to Landauer's principle, of which Shalizi is skeptical. The key point is that to gather new information you need to erase the previous information.

To be consistent with classical mechanics (and quantum mechanics as well), you cannot implement a map that sends arbitrary states to all zeros with no side effects. You can implement a map that sets your memory to all zeros, but only by dumping the old information somewhere else, which effectively increases the entropy of the environment.

(The above follows from Hamiltonian dynamics – i.e. preservation of phase-space volume in the classical case, and unitarity of evolution in the quantum case.)
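A toy illustration of that counting argument, assuming nothing beyond the fact that reversible dynamics must act as a bijection on the state space:

```python
from itertools import product

# A one-bit memory has states 0 and 1.
# "Erase in place": send both states to 0. This map is not injective, so no
# reversible (phase-space-volume-preserving) dynamics on the memory alone
# can implement it.
erase = {0: 0, 1: 0}
print(len(set(erase.values())))  # 1 output state for 2 input states

# Reversible alternative: swap the memory bit with a blank environment bit.
# Swap is a bijection on the joint (memory, environment) space, so it is
# allowed; the memory ends up blank, but the bit reappears in the environment.
def swap(mem, env):
    return env, mem

for mem, env in product([0, 1], repeat=2):
    print((mem, env), "->", swap(mem, env))
# With a blank environment, (1, 0) -> (0, 1): the memory is cleared, and the
# entropy is dumped into the environment.
```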

PS: A trick for today – "reducing entropy" (a numeric check follows the list):

  • Flip an unbiased coin, but don't look at the result ($H = 1$ bit).
  • Open your eyes. Now you know its state, so its entropy is $H = 0$ bits.
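A quick numeric check of the trick, sketched in Python:

```python
import math

def H(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

print(H([0.5, 0.5]))  # eyes closed: 1 bit
print(H([1.0, 0.0]))  # eyes open: 0 bits (for you; your memory record now
                      # carries the compensating entropy)
```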
