A Comment on Feynman, Entropy, and Uncertainty
17 July 2005 18:13
Not many people know it, but Richard Feynman gave a wonderful series of lectures on the physics of computation--see the Feynman Lectures on Computation in the Bibliography below.
An example in the book on the relationship of entropy to uncertainty suggested a number of other examples to me. Later, a comment by my friend and colleague Don Lemons explained my example in a more elegant way (see below). Furthermore, Feynman's simple model continues to suggest other interesting possibilities!
Consider an ideal gas in a piston occupying the volume V1. To isothermally compress the gas to a volume V2 = V1/2, the work required is

W = N k T log(V2/V1),

where N is the number of gas molecules, k is Boltzmann's constant, and T is the temperature.
Fig. 1. Initial state of gas occupying a volume V1 at a temperature T.
Because the compression is isothermal, the piston is at all times in contact with a thermal heat bath at a temperature T. The work required to compress the gas from the volume V1 to V2 is

W = ∫ p dV = N k T log(V2/V1),

where the integral runs from V1 to V2 and the ideal gas law p V = N k T has been used. Since V2 = V1/2, the above equation becomes

W = -N k T log 2.
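To get a concrete feel for the magnitudes, the isothermal-compression work W = N k T log(V2/V1) can be evaluated numerically. A minimal sketch, assuming one mole of ideal gas at T = 300 K (the temperature and amount of gas are my illustrative choices, not from Feynman):

```python
import math

# Physical constants (CODATA values)
K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def isothermal_work(n_molecules, temperature, v_final_over_v_initial):
    """Work done BY the gas in an isothermal volume change:
    W = N k T log(V2/V1). Negative for a compression (V2 < V1)."""
    return n_molecules * K_B * temperature * math.log(v_final_over_v_initial)

# One mole of ideal gas at T = 300 K, compressed to half its volume
w = isothermal_work(N_A, 300.0, 0.5)
print(f"W = {w:.1f} J")  # negative: work is done on the gas, about -1.7 kJ
```

Roughly 1.7 kJ of work per mole must be supplied, all of which drains into the bath as heat.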
The work is negative as a result of the convention that work done on a gas (rather than by it) is negative. Feynman then goes on to explain:
Now, ordinarily, when we compress a gas, we heat it up. This is a result of its constituent atoms speeding up and gaining kinetic energy. However, in our case, if we examine the molecules of the gas before and after compression, we find no difference. There are the same number, and they are jiggling about no more or less energetically than they were before. There is no difference between the two at the molecular level. So where did the work go? We put some in to compress the gas, and conservation of energy says it had to go somewhere. In fact, it was converted into internal gas heat, but was promptly drained off into the thermal bath, keeping the gas at the same temperature.

Fig. 2. Final state of gas occupying a volume V2 = V1/2 at temperature T.
Feynman goes on to say that what did change during the compression is the state of the gas, and introduces the concept of entropy, which he describes as "rather a bizarre quantity." Because a quantity of heat dQ was exchanged during the process at a constant temperature T, the entropy change of the gas by the Second Law is

ΔS = dQ/T.
Now, the isothermal compression did not change the gas temperature, so the internal energy of the gas was unaffected. From the First Law of Thermodynamics, the work done on the system by the compression can be equated with the heat flow out of the system into the thermal bath. Hence dQ = dW, and the entropy change is

ΔS = dW/T = -N k log 2,
and the entropy of the gas has decreased. (Of course, the process being reversible, the total entropy change of the system including the thermal bath must be zero.) Finally, the following leads to Feynman's main argument:
Now we take a bit of a leap, and it is not obvious that we can do this, but we can. We consider the case where our gas contains only one molecule. That is, we put N = 1 into our formulae. Now it's difficult to get a feeling for concepts like temperature...when you only have one molecule! However, these concepts make sense as long as we consider them to be time averaged, smoothing out the irregularities of this one particle as it bounces back and forth. Indeed, our formulae actually work with N = 1, as long as there is this hidden smoothing. The situation is more fun, too!
In other words, Feynman is correctly arguing that we can set N = 1, because of the well-known fact in statistical physics that an ensemble average is equivalent to a time average. So the single particle is bouncing around in the initial volume V1 shown in Fig. 1, and after the compression the particle is bouncing around in the half-volume V2 shown in Fig. 2. The physical state of the system before and after the compression is the same--except for the entropy. Setting N = 1 in the entropy equation above, the change in entropy when the volume is halved is now ΔS = - k log 2. Feynman continues:
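As a sanity check on the single-molecule number, ΔS = -k log 2 can be evaluated directly (the script is my own illustration, using the CODATA value of Boltzmann's constant):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Entropy change for N = 1 molecule isothermally compressed to half volume:
# dS = N k log(V2/V1) = -k log 2
delta_s = K_B * math.log(0.5)
print(f"dS = {delta_s:.3e} J/K")  # about -9.57e-24 J/K
```

A tiny number in absolute terms, but it is the irreducible thermodynamic cost of knowing which half of the volume the molecule is in, a point that connects directly to Feynman's discussion of the physics of computation.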
What has happened, and this is very subtle, is that my knowledge of the possible locations of the molecule has changed. In the initial state, it could be hiding anywhere in volume V1; after the compression, it must be somewhere within V2. In other words, there are fewer places it can be in....This concept of "knowledge" is extremely important, and central to the concept of entropy.
Well, I initially thought that this example was nicely illustrative of the tie-in between entropy and uncertainty. But then I thought: well, what about the case of an adiabatic compression from V1 to V2? In that case, there is no change in entropy since dQ = 0, but the particle is similarly constrained to the final (smaller) volume V2. Why is there no entropy change for this case?
I think I have a way to explain the counter-example I posed in the paragraph above. One difference between adiabatic and isothermal compression is that in the adiabatic case the single particle has gained energy (i.e., its temperature has increased). The higher temperature gives the particle a higher velocity, which must increase the uncertainty in the particle's location over time. This increase in uncertainty due to higher velocity offsets the decrease in uncertainty due to the smaller final piston volume, resulting in no net change in uncertainty (entropy).
Soon after I published this comment, Don Lemons (author of the well-known book Stochastic Processes in Physics) wrote to me and suggested a better way to explain this:
When the gas is adiabatically compressed the uncertainty in the position of the molecule decreases because there is less volume for it to be in. But the volume it occupies in velocity space increases. This last sentence is true even when thinking about one particle -- because you know nothing about the components of the particle velocity and the total speed is increased. Entropy is, according to the statistical and information theory interpretation, a measure of the total phase space of the particles making up the gas.
Further Comments: I think that the analogies above help to understand the similarities and differences between free expansion and adiabatic expansion of an ideal gas. Adiabatic expansion of a gas from a small volume V2 to a large volume V1 is a reversible process which decreases the gas temperature. The temperature decreases because internal energy of the gas must be expended to do work against the piston during the expansion. To use the analogy above, the configuration space of the molecule increases, but the velocity space decreases because of the cooling during the expansion. The net change in uncertainty (i.e., the entropy change) is zero.
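Don Lemons' phase-space picture can be made quantitative. A sketch, assuming a monatomic ideal gas (γ = 5/3; the original discussion does not fix the gas): the per-molecule entropy varies as k[log V + (3/2) log T], and along a reversible adiabat T V^(γ-1) is constant, so the configuration-space and velocity-space terms cancel exactly:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
GAMMA = 5.0 / 3.0   # heat-capacity ratio, assuming a monatomic ideal gas

def entropy_change(v_ratio, t_ratio):
    """Per-molecule entropy change of a monatomic ideal gas
    (additive constants cancel in the difference):
    dS = k [ log(V_final/V_initial) + (3/2) log(T_final/T_initial) ]
    First term: configuration space; second term: velocity space."""
    return K_B * (math.log(v_ratio) + 1.5 * math.log(t_ratio))

# Reversible adiabatic expansion from V2 to V1 = 2 V2:
# T V^(gamma - 1) is constant, so the gas cools by a factor 2^(-2/3).
v_ratio = 2.0
t_ratio = v_ratio ** (1.0 - GAMMA)

ds = entropy_change(v_ratio, t_ratio)
print(f"T_final/T_initial = {t_ratio:.4f}, dS = {ds:.2e} J/K")
# dS is zero to floating-point round-off: the two terms cancel exactly
```

The cancellation is not approximate: the +k log 2 from the doubled volume is exactly eaten by the (3/2) k log 2^(-2/3) = -k log 2 from the cooling.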
Free expansion of an ideal gas is also adiabatic--but no work is done by the gas in freely expanding from V2 to V1. From the First Law of Thermodynamics, there can be no change in internal energy, since both dQ = 0 and dW = 0. Hence, the gas temperature before and after the expansion is the same, as Joule first demonstrated experimentally in 1845. The configuration space of the molecule increases while its velocity space is unchanged, so the net effect is an increase in entropy, as must be true of any irreversible process.
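The free-expansion case can be put in numbers as well. A minimal sketch for one mole (the amount of gas is my illustrative choice), where only the configuration-space term contributes:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Free (Joule) expansion of one mole from V2 to V1 = 2 V2:
# dQ = 0 and dW = 0, so T is unchanged; only the larger
# configuration space contributes to the entropy change:
# dS = N k log(V1/V2) = R log 2
ds_molar = N_A * K_B * math.log(2.0)
print(f"dS = {ds_molar:.2f} J/K per mole")  # about 5.76 J/K
```

The velocity-space term is zero because the temperature does not change, so nothing cancels the configuration-space increase and the entropy rises.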
Note also that free expansion of an ideal gas, like all irreversible processes, entails a loss in the ability to perform useful work. This is readily seen by considering the two examples above. Instead of freely expanding a gas from a volume V to a new volume V+dV, we could have connected a piston to the gas to obtain useful work from the expansion. An irreversible process is always a process for which there is lost work. Energy is conserved in an irreversible process, but the energy is, in a sense, degraded, since it can no longer be used to perform useful work. In fact, after an irreversible process, work must be performed on the system to return it to its initial state. In the frictional motion of a mass on a surface, energy is conserved since the work required to move the mass goes into both kinetic energy and frictional heat generation. But the frictional heat generated during the process cannot be used to perform useful work, which is another way of stating the Second Law.
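The lost-work statement can be made quantitative for the single-molecule example: the work a reversible isothermal expansion from V2 to V1 = 2 V2 would have delivered equals T times the entropy generated by the free expansion. A sketch, assuming a bath temperature of 300 K (my illustrative value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # bath temperature, K (illustrative value)

# Work the single molecule COULD have delivered by expanding reversibly
# and isothermally from V2 to V1 = 2 V2 against a piston:
w_reversible = K_B * T * math.log(2.0)

# Entropy generated when it instead expands freely (no piston, no work):
ds_free = K_B * math.log(2.0)

# The lost work equals T times the entropy generated: W_lost = T dS
print(f"w_reversible = {w_reversible:.3e} J, T*dS = {T * ds_free:.3e} J")
```

The identity W_lost = T ΔS holds by construction here, but it is the general bookkeeping behind the statement that irreversibility degrades energy.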
Questions:
Can these analogies be extended to isobaric processes, where the number of particles--and not their phase space--is changed? For example, see the work of Humphrey and Linke (2005).
See also: Computation; Information Theory; Physics; Quantum Mechanics
Bibliography
Feynman Lectures on Computation, by R. Feynman, T. Hey, and R. Allen (Westview Press, Oxford, 1999).
T.E. Humphrey and H. Linke, Phys. Rev. Lett. 94, 096601 (2005).