Gibbs paradox

Originally considered by J. Willard Gibbs,[1][2] the Gibbs paradox concerns an apparent discontinuity in the entropy of mixing as the mixed components pass from being different to being identical, and its consequences for the extensivity of entropy and for irreversibility in thermodynamic systems.

Suppose we have a box divided in half by a movable partition, with a different ideal gas on each side; when the partition is removed, the gases mix and an entropy of mixing appears. The entropy of mixing of liquids, solids and solutions can be calculated in a similar fashion, so the Gibbs paradox applies to condensed phases as well as to the gaseous phase.

Similarity and entropy of mixing

When the Gibbs paradox is discussed, the relation between the entropy of mixing and the similarity of the components has long been controversial, and there are three very different opinions about how the entropy value depends on similarity (Figures a, b and c). Similarity may change continuously: similarity Z = 0 if the components are distinguishable; similarity Z = 1 if the components are indistinguishable. In the Gibbs paradox, however, the entropy of mixing does not change continuously.


There are many claimed resolutions[3] and all of them fall into one of these three kinds of relationship between the entropy of mixing and similarity (Figures a, b and c).

A resolution corresponding to Figure (a) consists of accepting the discontinuity as fact, and stating that the common sense and intuitive objections to it are unfounded. This is the resolution given by Gibbs, and clarified by Jaynes[4].

John von Neumann provided an alternative resolution of the Gibbs paradox by removing the discontinuity of the entropy of mixing: it decreases continuously with the increase in the property similarity of the individual components (See Figure b). More recently Shu-Kun Lin provided still another relationship (See Figure c). They will be explained in detail in a following section.

Entropy discontinuity

Classical explanation in thermodynamics

Gibbs himself proposed a solution to the problem which many scientists take as his own resolution of the Gibbs paradox.[4][5] The crux of his resolution is the fact that if one develops a classical theory based on the idea that the two different types of gas are indistinguishable, and one never carries out any measurement which reveals the difference, then the theory will have no internal inconsistencies. In other words, if we have two gases A and B and we have not yet discovered that they are different, then assuming they are the same will cause us no theoretical problems. If ever we perform an experiment with these gases that yields incorrect results, we will certainly have discovered a method of detecting their difference.

This insight suggests that the concepts of thermodynamic state and entropy are somewhat subjective. The increase in entropy as a result of mixing, multiplied by the temperature, is equal to the minimum amount of work we must do to restore the gases to their original separated state. Suppose that the two different gases are separated by a partition, but that we cannot detect the difference between them. We remove the partition. How much work does it take to restore the original thermodynamic state? None—simply reinsert the partition. The fact that the different gases have mixed does not yield a detectable change in the state of the gas, if by state we mean a unique set of values for all parameters that we have available to us to distinguish states. The minute we become able to distinguish the difference, at that moment the amount of work necessary to recover the original macroscopic configuration becomes non-zero, and the amount of work does not depend on the magnitude of the difference.
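The preceding paragraph can be made concrete with a minimal numerical sketch (the helper names are illustrative, assuming two equal volumes, each holding one mole of a different ideal gas): the entropy of mixing is ΔS = 2nR ln 2, and the minimum re-separation work is T·ΔS, regardless of how different the gases are.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def mixing_entropy(n_each):
    """Entropy of mixing for two equal volumes, each holding n_each moles
    of a different ideal gas: dS = 2 n R ln 2."""
    return 2.0 * n_each * R * math.log(2.0)

def min_separation_work(n_each, T):
    """Minimum work to restore the separated state: W = T * dS.
    Note that W does not depend on *how* different the gases are."""
    return T * mixing_entropy(n_each)

print(mixing_entropy(1.0))            # ~11.53 J/K
print(min_separation_work(1.0, 300))  # ~3458 J at 300 K
```

The same work is required whether the gases differ in mass, in isotope, or in any other detectable property; it drops to zero only when they are identical.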

The paradox is resolved by arguing that the discontinuity is real, and that any "common sense" or "intuitive" objection to it is unfounded.

Explanation in statistical mechanics and quantum mechanics, N! and entropy extensivity

A large number of scientists believe that this paradox is resolved by the Sackur-Tetrode equation.

The state of a classical ideal gas of N particles is specified by giving the momentum vector p and the position vector x for each particle. This can be thought of as specifying a point in a 6N-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles. The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy:

U=\frac{1}{2m}\sum_{i=1}^{N} \sum_{j=1}^3 p_{ij}^2

and be contained inside of the volume V (let's say V is a box of side X so that X³=V):

0 \le x_{ij} \le X

for i=1..N and j=1..3.

The first constraint defines the surface of a 3N-dimensional hypersphere of radius (2mU)^{1/2} and the second is a 3N-dimensional hypercube of volume V^N. These combine to form a 6N-dimensional hypercylinder. Just as the area of the wall of a cylinder is the circumference of the base times the height, so the area φ of the wall of this hypercylinder is:

\phi(U,V,N) = V^N \left(\frac{2\pi^{\frac{3N}{2}}(2mU)^{\frac{3N-1}{2}}}{\Gamma(3N/2)}\right)
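Because φ grows far beyond floating-point range even for modest N, in practice one works with its logarithm. A sketch of that calculation (the function name and the arbitrary units are illustrative) using the log-Gamma function:

```python
import math

def log_phi(U, V, N, m=1.0):
    """Natural log of phi(U,V,N) = V^N * 2 pi^(3N/2) (2 m U)^((3N-1)/2) / Gamma(3N/2),
    computed term by term with math.lgamma so that it never overflows."""
    return (N * math.log(V)
            + math.log(2.0)
            + (3.0 * N / 2.0) * math.log(math.pi)
            + ((3.0 * N - 1.0) / 2.0) * math.log(2.0 * m * U)
            - math.lgamma(3.0 * N / 2.0))

# phi itself leaves float range long before N approaches Avogadro's number,
# but its logarithm stays perfectly manageable:
print(log_phi(U=1.0, V=1.0, N=100))
```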

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. Another way of stating Heisenberg's uncertainty principle is to say that we cannot specify a volume in phase space smaller than h3N where h is Planck's constant. The above "area" must really be a shell of a thickness equal to the uncertainty in momentum Δp so we therefore write the entropy as:

S = k\,\ln(\phi\,\Delta p/h^{3N})

where the constant of proportionality is k, Boltzmann's constant.

We may take the box length X as the uncertainty in position; from Heisenberg's uncertainty principle, X\Delta p=\hbar/2. Solving for Δp, using Stirling's approximation for the Gamma function, and keeping only terms of order N, the entropy becomes:

S = k N \log \left[ V  \left(\frac UN \right)^{\frac 32}\right]+ {\frac 32}kN\left( 1+ \log\frac{4\pi m}{3h^2}\right)
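The omitted intermediate steps can be sketched as follows, keeping only terms of order N (so additive O(ln N) and O(1) contributions such as ln Δp and ln 2 are dropped):

```latex
S/k = \ln\frac{\phi\,\Delta p}{h^{3N}}
    \approx N\ln V + \frac{3N}{2}\ln(2\pi m U) - \ln\Gamma(3N/2) - 3N\ln h
```

Inserting Stirling's approximation, \ln\Gamma(3N/2) \approx \frac{3N}{2}\ln\frac{3N}{2} - \frac{3N}{2}, and collecting terms reproduces the expression above.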

This quantity is not extensive, as can be seen by considering two identical volumes with the same particle number and the same energy. Suppose the two volumes are initially separated by a barrier. Removing or reinserting the wall is reversible, but the entropy difference after removing the barrier is

\delta S = k \left[ 2N \log(2V) - N\log V - N \log V \right] = 2 k N \log 2 > 0

which is in contradiction to thermodynamics. This is the Gibbs paradox. It was resolved by J.W. Gibbs himself, by postulating that the gas particles are in fact indistinguishable. This means that all states that differ only by a permutation of particles should be considered as the same state. For example, if we have a 2-particle gas and we specify AB as a state of the gas where the first particle (A) has momentum p1 and the second particle (B) has momentum p2, then this state as well as the BA state, where the B particle has momentum p1 and the A particle has momentum p2, should be counted as the same state. For an N-particle gas, there are N! states which are identical in this sense, so to calculate the volume of phase space occupied by the gas we must divide the expression for φ above by N!.[6] This gives for the entropy:

S = k N \log \left[ \left(\frac VN\right)  \left(\frac UN \right)^{\frac 32}\right]+ {\frac 32}kN\left( {\frac 53}+ \log\frac{4\pi m}{3h^2}\right)

which can easily be shown to be extensive. This is the Sackur-Tetrode equation. With this equation, the entropy is unchanged when two samples of the same gas are mixed.
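Both claims are easy to verify numerically. The following sketch (in arbitrary units with k = m = h = 1, purely for illustration) evaluates the two entropy expressions above for a doubled system:

```python
import math

k = m = h = 1.0  # arbitrary units, purely for illustration

def S_without_N_factorial(N, V, U):
    """Entropy formula before the N! correction (not extensive)."""
    return (k * N * math.log(V * (U / N) ** 1.5)
            + 1.5 * k * N * (1.0 + math.log(4.0 * math.pi * m / (3.0 * h ** 2))))

def S_sackur_tetrode(N, V, U):
    """Sackur-Tetrode entropy, with the N! correction (extensive)."""
    return (k * N * math.log((V / N) * (U / N) ** 1.5)
            + 1.5 * k * N * (5.0 / 3.0 + math.log(4.0 * math.pi * m / (3.0 * h ** 2))))

N, V, U = 1000.0, 1.0, 1.0
spurious = S_without_N_factorial(2 * N, 2 * V, 2 * U) - 2 * S_without_N_factorial(N, V, U)
exact = S_sackur_tetrode(2 * N, 2 * V, 2 * U) - 2 * S_sackur_tetrode(N, V, U)
print(spurious)  # 2 k N ln 2, the paradoxical entropy of "mixing" identical gases
print(exact)     # 0 (up to rounding): doubling the system exactly doubles the entropy
```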

For a resolution within the framework of quantum thermodynamics, see also Allahverdyan and Nieuwenhuizen.[7]

Entropy continuity

 

Whereas many scientists feel comfortable with the entropy discontinuity shown in Figure (a) and are satisfied with the classical or quantum mechanical explanations in thermodynamics or statistical mechanics, others hold that the Gibbs paradox is a real paradox which should be resolved by demonstrating entropy continuity.

A quantum mechanics resolution of Gibbs paradox

Not many scientists have set out to prove that entropy of mixing is actually continuous. In his book Mathematical Foundations of Quantum Mechanics,[8] John von Neumann provided, for the first time, a resolution to the Gibbs paradox by removing the discontinuity of the entropy of mixing: it decreases continuously with the increase in the property similarity of the individual components (See Figure b).

On page 370 of the English edition of this book,[8] it reads: "... This clarifies an old paradox of the classical form of thermodynamics, namely the uncomfortable discontinuity in the operation with semi-permeable walls... We now have a continuous transition."

A few scientists agree with this resolution; others are still not convinced.

An information theory resolution of Gibbs paradox

Another entropy continuity relation has been proposed by Shu-Kun Lin[3] based on information-theory considerations, as shown in Figure (c). A calorimeter might be employed to determine the (information theory) entropy[11] if the two parts of the gas container are used to record 2 bits of information.

For condensed phases, instead of the word "mixing", the word "merging" can be used for the process of combining several parts of a substance originally held in several containers. It is then always a merging process, whether the substances are very different, very similar, or even the same. The conventional way of calculation gives an entropy of mixing of the order of Avogadro's number, whereas at most only 2 bits of information will be left.
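As a toy illustration of the 2-bit count mentioned above (a hypothetical counting example, not Lin's actual calorimetric argument): two containers, each holding either substance A or substance B, admit 2² = 4 distinguishable configurations, i.e. 2 bits of information.

```python
import math
from itertools import product

# Two containers, each holding either substance 'A' or substance 'B'.
configurations = list(product('AB', repeat=2))
print(configurations)  # [('A','A'), ('A','B'), ('B','A'), ('B','B')]

# Information needed to specify one configuration, in bits:
bits = math.log2(len(configurations))
print(bits)            # 2.0
```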

References and Notes

  1. ^ Gibbs, J. Willard (1876). Transactions of the Connecticut Academy, III, pp. 108-248, Oct. 1875-May 1876, and pp. 343-524, May 1877-July 1878.
  2. ^ Gibbs, J. Willard (1993). The Scientific Papers of J. Willard Gibbs - Volume One Thermodynamics. Ox Bow Press. ISBN 0-918024-77-3. 
  3. ^ a b A list of publications at the website: Gibbs paradox and its resolutions.
  4. ^ a b Jaynes, E.T. (1996). The Gibbs Paradox (PDF). Retrieved on November 8, 2005. (Jaynes, E. T. The Gibbs Paradox, In Maximum Entropy and Bayesian Methods; Smith, C. R.; Erickson, G. J.; Neudorfer, P. O., Eds.; Kluwer Academic: Dordrecht, 1992, p.1-22)
  5. ^ Ben-Naim, Arieh (2007). “On the So-Called Gibbs Paradox, and on the Real Paradox.” Entropy (an Open Access journal), 9(3): 132-136.
  6. ^ a b Gibbs, J. Willard (1902). Elementary principles in statistical mechanics. New York. ; (1981) Woodbridge, CT: Ox Bow Press ISBN 0-918024-20-X
  7. ^ Allahverdyan, A.E.; Nieuwenhuizen, T.M. (2006). “Explanation of the Gibbs paradox within the framework of quantum thermodynamics.” Physical Review E, 73 (6), Art. No. 066119. (Link to the paper at the journal's website).
  8. ^ a b von Neumann, John (1932). Mathematical Foundations of Quantum Mechanics. Princeton U. Press; reprinted, 1996 edition: ISBN 0-691-02893-1. (Translated from the German by Robert T. Beyer)
  9. ^ Caution: Do not do mixing experiments unless you are supervised by a professional chemist in a laboratory. A large amount of heat may be released, which can be attributed exclusively to chemical reactions occurring in the mixture.
  10. ^ This conclusion might be taken as an experimental resolution of Gibbs paradox for ideal gases.
  11. ^ By (information theory) entropy (also sometimes known as information theory entropy, informational entropy, information entropy, or Shannon entropy), we mean it is a dimensionless logarithmic function S = lnw in information theory. It is not a function of temperature T. It is not necessarily related to energy.
 
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Gibbs_paradox". A list of authors is available in Wikipedia.