Entropy

From SklogWiki

Revision as of 11:58, 30 May 2014

"Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step."
Constantino Tsallis [1]

Entropy was first described by Rudolf Julius Emanuel Clausius in 1865 [2]. The statistical mechanical description is due to Ludwig Eduard Boltzmann (Ref. ?).

==Classical thermodynamics==

In [[Classical thermodynamics | classical thermodynamics]] one has the entropy, <math>S</math>,

:<math>dS = \frac{\delta Q_{\mathrm{rev}}}{T}</math>

where <math>Q</math> is the heat and <math>T</math> is the temperature.
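As a minimal numerical sketch of the Clausius relation for the special case of reversible heat transfer at constant temperature (the numerical values are illustrative, not from SklogWiki):

```python
import math

# For heat Q absorbed reversibly at constant temperature T, the Clausius
# relation dS = dQ_rev / T integrates to Delta_S = Q / T.
Q = 1000.0  # heat absorbed, in joules (illustrative value)
T = 300.0   # absolute temperature, in kelvin (illustrative value)
delta_S = Q / T  # entropy change, in J/K

assert math.isclose(delta_S, 1000.0 / 300.0)
```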

==Statistical mechanics==

In [[statistical mechanics]] entropy is defined by

:<math>\left. S \right. := -k_B \sum_{i=1}^W p_i \ln p_i</math>

where <math>k_B</math> is the [[Boltzmann constant]], <math>i</math> is the index for the [[microstate |microstates]], and <math>p_i</math> is the probability that microstate <math>i</math> is occupied. In the [[microcanonical ensemble]] this gives:

:<math>\left.S\right. = k_B \ln W</math>

where <math>W</math> (sometimes written as <math>\Omega</math>) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system. This equation provides a link between [[Classical thermodynamics | classical thermodynamics]] and [[Statistical mechanics | statistical mechanics]].
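As a numerical check of the two definitions above, the following Python sketch (the function and constant names are my own, not SklogWiki's) verifies that for a uniform distribution over <math>W</math> microstates the Gibbs sum collapses to <math>k_B \ln W</math>:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i ln p_i over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

# In the microcanonical ensemble all W microstates are equally likely,
# so p_i = 1/W and the sum reduces to S = k_B ln W.
W = 1000
uniform = [1.0 / W] * W
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(W))
```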

==Tsallis entropy==

Tsallis entropy <ref>[http://dx.doi.org/10.1007/BF01016429 Constantino Tsallis "Possible generalization of Boltzmann-Gibbs statistics", Journal of Statistical Physics '''52''' pp. 479-487 (1988)]</ref> is defined as (Eq. 1)

:<math>S_q := k_B \frac{1-\sum_{i=1}^W p_i^q}{q-1}</math>
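A property worth noting is that the Tsallis entropy recovers the Boltzmann-Gibbs form in the limit <math>q \to 1</math>. The Python sketch below (the names and the choice <math>k_B = 1</math> are illustrative) checks this numerically for <math>q</math> close to 1:

```python
import math

K_B = 1.0  # work in reduced units where the Boltzmann constant is 1

def tsallis_entropy(probs, q):
    """S_q = k_B * (1 - sum_i p_i^q) / (q - 1), for q != 1 (Eq. 1)."""
    return K_B * (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def gibbs_entropy(probs):
    """Boltzmann-Gibbs limit: S = -k_B * sum_i p_i ln p_i."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

# As q -> 1, p_i^q ~ p_i (1 + (q-1) ln p_i), so S_q -> -k_B sum p_i ln p_i.
p = [0.5, 0.3, 0.2]
assert math.isclose(tsallis_entropy(p, 1.0 + 1e-6), gibbs_entropy(p),
                    rel_tol=1e-4)
```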

==Arrow of time==

Articles:

Books:

*Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
*Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
*Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time" Oxford University Press (1997) ISBN 978-0-19-511798-1


==References==
<references/>
