Boltzmann's entropy formula



  In statistical mechanics, Boltzmann's entropy formula relates the entropy S of an ideal gas to W, the number of microstates corresponding to a given macrostate:

S = k \log W           (1)

where k is the Boltzmann constant (equal to 1.380649 × 10⁻²³ J/K) and log is the natural logarithm. In short, the formula connects the entropy of a macroscopic state to the number of microscopic arrangements consistent with it.

History

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875,[2][3] but was later put into its current form by Max Planck in about 1900.

The value of W, specifically, is the Wahrscheinlichkeit (German for "probability"): the number of possible permutations of the molecules among the molecular conditions,

W = N!\; / \; \prod_i N_i!           (2)

where i ranges over all possible molecular conditions and ! denotes factorial. The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
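The count in equation (2) can be sketched in a few lines of Python (a minimal illustration; the occupation numbers in the example are made up):

```python
from math import factorial

def multiplicity(occupations):
    """W = N! / prod(N_i!), equation (2): the number of distinct
    arrangements of N particles given the occupation numbers N_i
    of each molecular condition."""
    n_total = sum(occupations)
    denom = 1
    for n_i in occupations:
        denom *= factorial(n_i)  # identical particles in the same condition are indistinguishable
    return factorial(n_total) // denom

# Example: 4 molecules split as 2 + 1 + 1 over three conditions
print(multiplicity([2, 1, 1]))  # 4!/(2!·1!·1!) = 12
```

Note that W is always a positive integer, consistent with the remark above that it is a "thermodynamic probability" rather than a mathematical one.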

Generalization

Boltzmann's formula applies to microstates of the universe as a whole, each possible microstate of which is presumed to be equally probable.

But in thermodynamics, it is important to be able to make the approximation of dividing the universe into a system of interest plus its surroundings, and then to identify the entropy of the system with the system entropy of classical thermodynamics. The microstates of such a thermodynamic system are not equally probable: for example, high-energy microstates are less probable than low-energy microstates for a thermodynamic system kept at a fixed temperature by contact with a heat bath.

For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

S = - k \sum p_i \log p_i           (3)

where the sum runs over all microstates i and p_i is the probability of microstate i. This reduces to equation (1) if the probabilities p_i are all equal (p_i = 1/W).
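The reduction can be checked numerically (a minimal sketch in Python; the value of W is chosen arbitrarily): with W equally likely microstates, equation (3) collapses to S = k log W.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p_i * ln p_i), equation (3)."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

W = 100
equal = [1.0 / W] * W
# Each term is -(1/W) * ln(1/W); summing W of them gives k * ln W, equation (1)
assert math.isclose(gibbs_entropy(equal), K_B * math.log(W))
```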

Boltzmann used a ρ log ρ formula as early as 1866.[4] He interpreted ρ as a density in phase space, without mentioning probability, but since this satisfies the axiomatic definition of a probability measure, we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work[5] and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3): in every situation where equation (1) is valid, equation (3) is valid also, and not vice versa.

Boltzmann entropy excludes statistical dependencies

The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle, i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles, and may or may not be a good approximation for other systems.[6]
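The factorization assumption can be illustrated numerically (a hedged sketch; the one-particle distribution below is hypothetical, and k is set to 1 for simplicity): when the joint probability of a many-particle microstate is a product of identical one-particle terms, the entropy is just N times the one-particle entropy.

```python
import math
from itertools import product

def entropy(probs):
    """-sum(p * ln p) with k set to 1 for simplicity."""
    return -sum(p * math.log(p) for p in probs if p > 0)

single = [0.5, 0.3, 0.2]   # hypothetical one-particle distribution
n = 3                      # number of independent identical particles

# Under the factorization approximation, the probability of each
# n-particle microstate is the product of the one-particle probabilities
joint = [math.prod(ps) for ps in product(single, repeat=n)]

# Entropy is additive over independent particles: S_joint = n * S_single
assert math.isclose(entropy(joint), n * entropy(single))
```

Correlations between particles would lower the joint entropy below this additive value, which is why ignoring them is an approximation.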

References

  1. ^ See: photo of Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.
  2. ^ Boltzmann equation – Eric Weisstein’s World of Physics (states the year was 1872)
  3. ^ Perrot, Pierre (1998). A to Z of Thermodynamics. Oxford University Press. ISBN 0-19-856552-6.  (states the year was 1875)
  4. ^ Ludwig Boltzmann. "Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie". Wiener Berichte 53: 195–220.
  5. ^ Ludwig Boltzmann (1896 and 1898). Vorlesungen über Gastheorie. J.A. Barth, Leipzig. 
  6. ^ Jaynes, E. T. (1965). Gibbs vs Boltzmann entropies. American Journal of Physics, 33, 391-8.

External links

  • Introduction to Boltzmann's Equation
 
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Boltzmann's_entropy_formula". A list of authors is available in Wikipedia.