{{Stub-general}}
:'' "Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step." '' '''Constantino Tsallis''' <ref>http://www.mlahanas.de/Greeks/new/Tsallis.htm</ref>
{{Cleanup-rewrite}}
'''Entropy''' was first described by [[Rudolf Julius Emanuel Clausius]] in 1865 <ref>[http://dx.doi.org/10.1002/andp.18652010702 R. Clausius "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie", Annalen der Physik und Chemie '''125''' pp. 353-400 (1865)]</ref>. The [[statistical mechanics | statistical mechanical]] description is due to [[Ludwig Eduard Boltzmann]] (Ref. ?). The word entropy originates from the Greek word "τροπή", meaning a turning or transformation <ref>[https://books.google.es/books?id=8LIEAAAAYAAJ&pg=PA357  Rudolf Clausius "The Mechanical Theory of Heat: With Its Applications to the Steam-engine and to the Physical Properties of Bodies", London (1867) page 357]</ref>.
==Classical thermodynamics==
In [[classical thermodynamics]] the entropy, <math>S</math>, is defined through the relation
:<math>{\mathrm d} S = \frac{\delta Q_{\mathrm {reversible}}} {T} </math>


where <math>\delta Q_{\mathrm {reversible}}</math> is the [[heat]] absorbed reversibly by the system and <math>T</math> is the [[temperature]].
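Integrating along a reversible path gives the entropy change for a finite process. As a simple illustration (the quantities <math>n</math>, <math>R</math>, <math>V_1</math> and <math>V_2</math> are introduced here only for the example), for the reversible isothermal expansion of <math>n</math> moles of an ideal gas from volume <math>V_1</math> to <math>V_2</math> the heat absorbed is <math>Q_{\mathrm{reversible}} = nRT \ln (V_2/V_1)</math>, so that

:<math>\Delta S = \int \frac{\delta Q_{\mathrm {reversible}}}{T} = \frac{Q_{\mathrm {reversible}}}{T} = nR \ln \frac{V_2}{V_1}</math>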
==Statistical mechanics==
In [[statistical mechanics]] entropy is defined by


:<math>\left. S \right. := -k_B \sum_{i=1}^W p_i \ln p_i</math>

where <math>k_B</math> is the [[Boltzmann constant]], <math>i</math> is the index for the [[microstate |microstates]], and <math>p_i</math> is the probability that microstate ''i'' is occupied.
In the [[microcanonical ensemble]] this gives:


:<math>\left.S\right. = k_B \ln W</math>


where <math>W</math> (sometimes written as <math>\Omega</math>) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system.
This equation provides a link between [[Classical thermodynamics | classical thermodynamics]] and [[Statistical mechanics | statistical mechanics]].
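Assuming equal ''a priori'' probabilities, each of the <math>W</math> accessible microstates in the microcanonical ensemble has <math>p_i = 1/W</math>, and the general definition above reduces to the Boltzmann form:

:<math>S = -k_B \sum_{i=1}^W \frac{1}{W} \ln \frac{1}{W} = k_B \ln W</math>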
==Tsallis entropy==
Tsallis (or ''non-additive'') entropy <ref>[http://dx.doi.org/10.1007/BF01016429 Constantino Tsallis "Possible generalization of Boltzmann-Gibbs statistics", Journal of Statistical Physics '''52''' pp. 479-487 (1988)]</ref> is defined as (Eq. 1)
:<math>S_q:= k_B \frac{1-\sum_{i=1}^W p_i^q}{q-1}</math>
where <math>q</math> is the ''Tsallis index'' <ref>[http://dx.doi.org/10.1103/PhysRevE.78.021102 Filippo Caruso and Constantino Tsallis "Nonadditive entropy reconciles the area law in quantum systems with classical thermodynamics", Physical Review E '''78''' 021102 (2008)]</ref>.
As <math>q \rightarrow 1 </math> one recovers the standard expression for entropy. This expression for the entropy is the cornerstone of [[non-extensive thermodynamics]].
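The limit can be made explicit by writing <math>p_i^q = p_i e^{(q-1)\ln p_i} \approx p_i \left[1 + (q-1)\ln p_i\right]</math>, whence <math>1-\sum_i p_i^q \approx -(q-1)\sum_i p_i \ln p_i</math> and <math>S_q \rightarrow -k_B \sum_i p_i \ln p_i</math>. The following minimal numerical sketch (Python with NumPy is assumed; the function names are purely illustrative) evaluates the standard and Tsallis entropies for an arbitrary discrete distribution and shows <math>S_q</math> approaching the standard value as <math>q \rightarrow 1</math>:
<pre>
import numpy as np

k_B = 1.0  # Boltzmann constant in reduced units

def gibbs_entropy(p):
    """S = -k_B sum_i p_i ln p_i, with 0 ln 0 taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0.0]
    return -k_B * np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """S_q = k_B (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0.0]
    return k_B * (1.0 - np.sum(p ** q)) / (q - 1.0)

# Uniform distribution over W = 4 microstates: S = k_B ln W
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]), np.log(4.0))

# S_q approaches the standard entropy as q -> 1
p = [0.5, 0.3, 0.2]
for q in (2.0, 1.1, 1.01, 1.001):
    print(q, tsallis_entropy(p, q), gibbs_entropy(p))
</pre>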
==Arrow of time==
Articles:
*[http://dx.doi.org/10.1119/1.1942052 T. Gold "The Arrow of Time",  American Journal of Physics '''30''' pp. 403-410 (1962)]
*[http://dx.doi.org/10.1063/1.881363 Joel L. Lebowitz "Boltzmann's Entropy and Time's Arrow", Physics Today '''46''' pp. 32-38 (1993)]
*[http://dx.doi.org/10.1023/A:1023715732166 Milan M. Ćirković "The Thermodynamical Arrow of Time: Reinterpreting the Boltzmann–Schuetz Argument", Foundations of Physics '''33''' pp. 467-490 (2003)]
*[http://dx.doi.org/10.1103/PhysRevE.79.061103 Noah Linden, Sandu Popescu, Anthony J. Short, and Andreas Winter "Quantum mechanical evolution towards thermal equilibrium", Physical Review E '''79''' 061103 (2009)]
Books:
* Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
* Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
*  Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time", Oxford University Press (1997) ISBN 978-0-19-511798-1


==See also:==
*[[Entropy of a glass]]
*[[H-theorem]]
*[[Non-extensive thermodynamics]]
*[[Shannon entropy]]
*[[Tsallis entropy]]
==References==
<references/>
 
'''Related reading'''
*[http://dx.doi.org/10.1119/1.1990592 Karl K. Darrow "The Concept of Entropy",  American Journal of Physics '''12''' pp.  183-196 (1944)]
*[http://dx.doi.org/10.1119/1.1971557 E. T. Jaynes "Gibbs vs Boltzmann Entropies",  American Journal of Physics '''33''' pp. 391-398 (1965)]
*[http://dx.doi.org/10.1119/1.1287353 Daniel F. Styer "Insight into entropy",  American Journal of Physics '''68''' pp. 1090-1096 (2000)]
*[http://www.ucl.ac.uk/~ucesjph/reality/entropy/text.html S. F. Gull "Some Misconceptions about Entropy" in Brian Buck and Vincent A. MacAulay (Eds.) "Maximum Entropy in Action", Oxford Science Publications (1991)]
*[http://dx.doi.org/10.2174/1874396X00802010007 Efstathios E. Michaelides "Entropy, Order and Disorder", The Open Thermodynamics Journal '''2''' pp. (2008)]
*Ya. G. Sinai, "On the Concept of Entropy of a Dynamical System," Doklady Akademii Nauk SSSR '''124''' pp. 768-771 (1959)
*[http://dx.doi.org/10.1063/1.1670348 William G. Hoover "Entropy for Small Classical Crystals", Journal of Chemical Physics '''49''' pp. 1981-1982 (1968)]
* Arieh Ben-Naim "Entropy Demystified: The Second Law Reduced to Plain Common Sense", World Scientific (2008) ISBN 978-9812832252
* Arieh Ben-Naim "Farewell to Entropy: Statistical Thermodynamics Based on Information",  World Scientific (2008) ISBN 978-981-270-707-9
* Arieh Ben-Naim "Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature" World Scientific Publishing (2010) ISBN: 978-981-4299-75-6
* Arieh Ben-Naim "Entropy and the Second Law Interpretation and Misss-Interpretations", World Scientific Publishing (2012) ISBN 978-981-4407-55-7
* Arieh Ben-Naim "Information, Entropy, Life and the Universe: What We Know and What We Do Not Know" World Scientific Publishing (2015) ISBN 978-981-4651-66-0
* Arieh Ben-Naim "Entropy The Truth, the Whole Truth, and Nothing But the Truth", World Scientific Publishing (2016) ISBN 978-981-3147-67-6
*[http://dx.doi.org/10.1063/1.4879553  Jose M. G. Vilar and J. Miguel Rubi "System-size scaling of Boltzmann and alternate Gibbs entropies", Journal of Chemical Physics '''140''' 201101 (2014)]
*[http://doi.org/10.1063/1.4972525 Misaki Ozawa and Ludovic Berthier "Does the configurational entropy of polydisperse particles exist?", Journal of Chemical Physics '''146''' 014502 (2017)]
*[http://dx.doi.org/10.1080/00268976.2016.1238523 Simin Yazdi Nezhad and Ulrich K. Deiters "Estimation of the entropy of fluids with Monte Carlo computer simulation", Molecular Physics '''115''' pp. 1074-1085 (2017)]
*[http://dx.doi.org/10.1063/1.4984965 Gérôme Faure, Rafael Delgado-Buscalioni, and Pep Español "The entropy of a complex molecule", Journal of Chemical Physics '''146''' 224106 (2017)]
 
==External links==
*[http://www.mdpi.com/journal/entropy entropy] an international and interdisciplinary Open Access journal of entropy and information studies.
*[http://dx.doi.org/10.4249/scholarpedia.3448 Joel L. Lebowitz "Time's arrow and Boltzmann's entropy", Scholarpedia, 3(4):3448 (2008)]
[[category: statistical mechanics]]
[[category: Classical thermodynamics]]
