Entropy (order and disorder)



  In thermodynamics, entropy is often associated with the amount of order or disorder in a working body. This usage traces to Rudolf Clausius' assertion that thermodynamic processes involve alterations in the arrangement of the constituent parts of the working body, and that the internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression:[1]

dS = \frac{\delta Q}{T}

In the years to follow, Ludwig Boltzmann translated these "alterations" into a probabilistic view of order and disorder in gas-phase molecular systems.

In recent years, some chemistry publications have shifted away from the terms "order" and "disorder" toward the concept of energy dispersal to describe entropy, among other theories. In the 2002 encyclopedia Encarta, for example, entropy is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium, as well as a measure of the disorder in the system.[2] In the context of entropy, "perfect internal disorder" is synonymous with "equilibrium", but since that definition differs so markedly from the usual definition implied in normal speech, the use of the term in science has caused a great deal of confusion and misunderstanding.

Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, and to living organisms. This local decrease in entropy is, however, only possible at the expense of an entropy increase in the surroundings.

History

This "molecular ordering" entropy perspective traces its origins to molecular movement interpretations developed by Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics.[3]

In 1864, Hermann von Helmholtz used the word "Unordnung" (disorder) to describe entropy.[4]

Overview

To highlight the fact that order and disorder are commonly understood to be measured in terms of entropy, below are current science encyclopedia and science dictionary definitions of entropy:

  • Entropy – a measure of the unavailability of a system’s energy to do work; also a measure of disorder; the higher the entropy the greater the disorder.[5]
  • Entropy – a measure of disorder; the higher the entropy the greater the disorder.[6]
  • Entropy – in thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy.[7]
  • Entropy – a measure of disorder in the universe or of the availability of the energy in a system to do work.[8]

Entropy and disorder also have associations with equilibrium. Technically, the entropy of a thermodynamic system is a measure of the disorder in the arrangements of its particles.[10] In a stretched-out piece of rubber, for example, the arrangement of the molecules of its structure has an "ordered" distribution and has zero entropy, while the "disordered" kinky distribution of the atoms and molecules in the rubber in the non-stretched state has positive entropy. Similarly, in a gas, the order is perfect and the measure of entropy of the system has its lowest value when all the molecules are in one place, whereas when more points are occupied the gas is all the more disorderly and the measure of the entropy of the system has its largest value.[10]

In systems ecology, as another example, the entropy of a collection of items comprising a system is defined as a measure of their disorder, or equivalently the relative likelihood of the instantaneous configuration of the items.[11] Moreover, theoretical ecologist and chemical engineer Robert Ulanowicz describes negentropy, the negative of entropy, as a measure of the structural order within an organism.[11]

The mathematical basis for the association of entropy with order and disorder began, essentially, with the famous Boltzmann formula, S = k ln W, which relates entropy S to the number of possible states W in which a system can be found.[13] The relationship between entropy, order, and disorder in the Boltzmann equation is so clear that, according to the views of thermodynamic ecologists Sven Jorgensen and Yuri Svirezhev, "it is obvious that entropy is a measure of order or, most likely, disorder in the system."[13] In this direction, the second law of thermodynamics, as famously enunciated by Rudolf Clausius in 1865, states that:

The entropy of the universe tends to a maximum.
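The Boltzmann formula can be evaluated numerically. The two-sided box model below is a simplifying assumption chosen for illustration, not a model taken from this article: N distinguishable particles are each assigned to the left or right half of a box, and the multiplicity W of a macrostate is the number of ways to place n particles on the left.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(w: int) -> float:
    """Entropy S = k ln W for a system with W equally likely microstates."""
    return K_B * math.log(w)

# Toy model: N particles, each in the left or right half of a box.
# The number of microstates with n particles on the left is C(N, n).
N = 100
ordered = math.comb(N, 0)      # all particles on one side: W = 1
disordered = math.comb(N, 50)  # evenly spread: W is maximal

print(boltzmann_entropy(ordered))     # 0.0 -- perfect order, zero entropy
print(boltzmann_entropy(disordered))  # largest value -- maximal disorder
```

The "ordered" configuration (all particles in one place) has a single microstate and hence zero entropy, while the evenly spread configuration maximizes W and hence S, matching the gas example above.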

Thus, if entropy is associated with disorder, and if the entropy of the universe tends toward maximal entropy, then many are often puzzled as to the nature of the "ordering" process and operation of evolution in relation to Clausius' most famous version of the second law, which states that the universe is headed towards maximal "disorder". In the 2003 book SYNC – the Emerging Science of Spontaneous Order by Steven Strogatz, for example, we find "Scientists have often been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures—galaxies, cells, ecosystems, human beings—that have all somehow managed to assemble themselves."[14]

The common argument used to explain this is that, locally, entropy can be lowered by external action, e.g. solar heating, and that this applies to machines, such as a refrigerator, where the entropy in the cold chamber is reduced, to growing crystals, and to living organisms.[2] This local increase in order is, however, only possible at the expense of an entropy increase in the surroundings; here more disorder must be created.[2][15] This statement is conditioned on the fact that living systems are open systems, in which heat, mass, and/or work may transfer into or out of the system. Unlike temperature, the putative entropy of a living system would drastically change if the organism were thermodynamically isolated. If an organism were in this type of "isolated" situation, its entropy would increase markedly as the once-living components of the organism decayed to an unrecognizable mass.[11]

Phase change

Owing to these early developments, the typical example of entropy change ΔS is that associated with phase change. Solids, for example, which are typically ordered on the molecular scale, usually have smaller entropy than liquids; liquids have smaller entropy than gases; and colder gases have smaller entropy than hotter gases. Moreover, according to the third law of thermodynamics, at absolute zero temperature crystalline structures are approximated to have perfect "order" and zero entropy. This correlation occurs because the number of different microscopic quantum energy states available to an ordered system is usually much smaller than the number of states available to a system that appears to be disordered.
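For a reversible phase change at constant temperature, the Clausius expression integrates to ΔS = ΔH/T. As a minimal numerical sketch, the snippet below uses standard textbook values for the melting of ice; these particular numbers are an assumption for illustration and do not appear in this article.

```python
# Entropy of a phase change: delta_S = delta_H / T at the transition
# temperature (the integral of dQ/T with T held fixed).
DELTA_H_FUSION = 6010.0  # J/mol, enthalpy of fusion of ice (textbook value)
T_MELT = 273.15          # K, melting point of ice at 1 atm

delta_s_fusion = DELTA_H_FUSION / T_MELT
print(f"{delta_s_fusion:.1f} J/(mol K)")  # ~22.0: the liquid is more "disordered"
```

The positive sign reflects the solid-to-liquid direction: the ordered crystal lattice gives way to the more disordered liquid, so the entropy of the substance increases.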

In his famous 1896 Lectures on Gas Theory, Boltzmann diagrams the structure of a solid body by postulating that each molecule in the body has a "rest position". As heat is added to a solid, so as to make it into a liquid or a gas, the common depiction is that the ordering of the atoms and molecules becomes more random and chaotic with an increase in temperature.

Thus, according to Boltzmann, owing to increases in thermal motion, whenever heat is added to a working substance, the rest positions of molecules will be pushed apart, the body will expand, and this will create more disordered distributions and arrangements of molecules. These disordered arrangements subsequently correlate, via probability arguments, to an increase in the measure of entropy.[17]

Adiabatic demagnetization

In the quest for ultra-cold temperatures, a temperature-lowering technique called adiabatic demagnetization is used, in which atomic entropy considerations that can be described in order-disorder terms are utilized.[18] In this process, a sample of a solid such as chrome alum salt, whose molecules are equivalent to tiny magnets, is placed inside an insulated enclosure and cooled to a low temperature, typically 2 or 4 kelvins, with a strong magnetic field applied to the container using a powerful external magnet, so that the tiny molecular magnets are aligned, forming a well-ordered "initial" state at that low temperature. This magnetic alignment means that the magnetic energy of each molecule is minimal.[19] The external magnetic field is then reduced, a removal that is considered to be closely reversible. Following this reduction, the atomic magnets assume random, less-ordered orientations, owing to thermal agitations, in the "final" state.


The "disorder" and hence the entropy associated with the change in the atomic alignments has clearly increased.[18] In terms of energy flow, the movement from a magnetically aligned state requires energy from the thermal motion of the molecules, converting thermal energy into magnetic energy.[19] Yet, according to the temperature of the specimen must decrease by the same amount.[18] The temperature thus falls as a result of this process of thermal energy being converted into magnetic energy. If the magnetic field is then increased, the temperature rises and the magnetic salt has to be cooled again using a cold material such as liquid helium.[19]

References

  1. ^ Mechanical Theory of Heat – Nine Memoirs on the Development of the Concept of "Entropy" by Rudolf Clausius [1850-1865]
  2. ^ a b c d Microsoft ® Encarta ® 2006. © 1993-2005 Microsoft Corporation. All rights reserved.
  3. ^ Mahon, Basil (2003). The Man Who Changed Everything – the Life of James Clerk Maxwell. Hoboken, NJ: Wiley. ISBN 0-470-86171-1. 
  4. ^ Anderson, Greg (2005). Thermodynamics of Natural Systems. Cambridge University Press. ISBN 0-521-84772-9. 
  5. ^ Oxford Dictionary of Science, 2005
  6. ^ Oxford Dictionary of Chemistry, 2004
  7. ^ Barnes & Noble's Essential Dictionary of Science, 2004
  8. ^ Gribbin's Encyclopedia of Particle Physics, 2000
  10. ^ a b Greven, Andreas; Keller, Gerhard; Warnecke, Gerald (2003). Entropy – Princeton Series in Applied Mathematics. Princeton University Press. ISBN 0-691-11338-6. 
  11. ^ a b c d Ulanowicz, Robert, E. (2000). Growth and Development – Ecosystems Phenomenology. toExcel Press. ISBN 0-595-00145-9. 
  12. ^ Kubat, L.; Zeman, J. (1975). Entropy and Information in Science and Philosophy. Elsevier. 
  13. ^ a b Jorgensen, Sven J.; Svirezhev, Yuri M. (2004). Towards a Thermodynamic Theory for Ecological Systems. Elsevier. ISBN 0-08-044167-X. 
  14. ^ Strogatz, Steven (2003). SYNC – the Emerging Science of Spontaneous Order. Theia. ISBN 0-7868-6844-9. 
  15. ^
  16. ^ Cercignani, Carlo (1998). Ludwig Boltzmann - the Man Who Trusted Atoms. Oxford University Press. ISBN 0-19-850154-4. 
  17. ^ Boltzmann, Ludwig (1896). Lectures on Gas Theory. Dover (reprint). ISBN 0-486-68455-5. 
  18. ^ a b c Halliday, David; Resnick, Robert (1988). Fundamentals of Physics, Extended 3rd ed.. Wiley. ISBN 0-471-81995-6. 
  19. ^ a b c NASA - How does an Adiabatic Demagnetization Refrigerator Work ?
 
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Entropy_(order_and_disorder)". A list of authors is available in Wikipedia.