{{Stub-general}}
:'' "Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step." ''
{{Cleanup-rewrite}}
::::: '''[[Constantino Tsallis]]''' <ref>http://www.mlahanas.de/Greeks/new/Tsallis.htm</ref>
'''Entropy''' was first described by [[Rudolf Julius Emanuel Clausius]] in 1865 <ref>[http://dx.doi.org/10.1002/andp.18652010702 R. Clausius "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie", Annalen der Physik und Chemie '''125''' pp. 353-400 (1865)]</ref>. The [[statistical mechanics | statistical mechanical]] description is due to [[Ludwig Eduard Boltzmann]] (Ref. ?).
==Classical thermodynamics==
In [[classical thermodynamics]] the entropy, <math>S</math>, is defined by
:<math>{\mathrm d} S = \frac{\delta Q_{\mathrm {reversible}}} {T} </math>


where <math>\delta Q_{\mathrm {reversible}}</math> is the [[heat]] transferred reversibly and <math>T</math> is the [[temperature]].
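For a reversible isothermal process, for example, <math>T</math> is constant and the integration is immediate,

:<math>\Delta S = \int \frac{\delta Q_{\mathrm {reversible}}}{T} = \frac{Q_{\mathrm {reversible}}}{T},</math>

so transferring the same amount of heat reversibly at a lower temperature produces a larger change in entropy.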
==Statistical mechanics==
In [[statistical mechanics]] the entropy, <math>S</math>, is defined by


:<math>\left. S \right. = -k_B \sum_m p_m \ln p_m</math>
where <math>k_B</math> is the [[Boltzmann constant]], <math>m</math> is the index for the microstates, and <math>p_m</math> is the probability that microstate <math>m</math> is occupied. In the [[microcanonical ensemble]], where each of the <math>\Omega</math> accessible microstates is equally probable so that <math>p_m = 1/\Omega</math>, this gives

:<math>\left.S\right. = k_B \ln \Omega</math>

where <math>\Omega</math> (sometimes written as <math>W</math>) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system. This equation provides a link between classical thermodynamics and statistical mechanics.
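A minimal numerical sketch of this definition (in Python; the function name and the choice of <math>\Omega = 100</math> microstates are arbitrary illustrations) evaluates the sum for a given discrete distribution and checks that the uniform distribution recovers <math>k_B \ln \Omega</math>:
<pre>
# Illustrative sketch: the Gibbs entropy S = -k_B sum_m p_m ln p_m for a
# discrete probability distribution, and a check that the uniform
# (microcanonical) case reduces to Boltzmann's S = k_B ln Omega.
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K


def gibbs_entropy(probabilities):
    """Return -k_B * sum_m p_m ln p_m, skipping states with p_m = 0."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)


omega = 100                       # number of accessible microstates (arbitrary)
uniform = [1.0 / omega] * omega   # microcanonical ensemble: p_m = 1/Omega

print(gibbs_entropy(uniform))     # equals k_B ln(Omega) ...
print(K_B * math.log(omega))      # ... i.e. Boltzmann's expression

# Any non-uniform distribution over the same states has a lower entropy:
biased = [0.5] + [0.5 / (omega - 1)] * (omega - 1)
print(gibbs_entropy(biased) < gibbs_entropy(uniform))   # True
</pre>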
==Arrow of time==
'''Articles'''

'''Books'''
*Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
*Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
*Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time", Oxford University Press (1997) ISBN 978-0-19-511798-1
==See also==
*[[Entropy of a glass]]
*[[H-theorem]]
*[[Non-extensive thermodynamics]]
*[[Shannon entropy]]
*[[Tsallis entropy]]
==References==
<references/>
 
'''Related reading'''
*[http://dx.doi.org/10.1119/1.1990592 Karl K. Darrow "The Concept of Entropy",  American Journal of Physics '''12''' pp.  183-196 (1944)]
*[http://dx.doi.org/10.1119/1.1971557 E. T. Jaynes "Gibbs vs Boltzmann Entropies",  American Journal of Physics '''33''' pp. 391-398 (1965)]
*[http://www.ucl.ac.uk/~ucesjph/reality/entropy/text.html S. F. Gull "Some Misconceptions about Entropy" in Brian Buck and Vincent A. MacAulay (Eds.) "Maximum Entropy in Action", Oxford Science Publications (1991)]
*[http://dx.doi.org/10.2174/1874396X00802010007 Efstathios E. Michaelides "Entropy, Order and Disorder", The Open Thermodynamics Journal '''2''' (2008)]
*Ya. G. Sinai "On the Concept of Entropy of a Dynamical System", Doklady Akademii Nauk SSSR '''124''' pp. 768-771 (1959)
*[http://dx.doi.org/10.1063/1.1670348 William G. Hoover "Entropy for Small Classical Crystals", Journal of Chemical Physics '''49''' pp. 1981-1982 (1968)]
[[category: statistical mechanics]]
[[category: Classical thermodynamics]]
