Entropy

From SklogWiki
"Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step." Constantino Tsallis [1]

Entropy was first described by Rudolf Julius Emanuel Clausius in 1865 [2]. The statistical mechanical description is due to Ludwig Eduard Boltzmann (Ref. ?). The word entropy originates from the Greek word "τροπή", meaning a turning or transformation [3].

Classical thermodynamics

In classical thermodynamics one has the entropy, $S$,

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}$$

where $\delta Q_{\mathrm{rev}}$ is the heat transferred reversibly and $T$ is the temperature.
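As a brief numerical illustration (not part of the original article), the entropy change for heat transferred reversibly at constant temperature follows directly from the definition above; the Python snippet below, with illustrative variable names and values, evaluates $\Delta S = Q_{\mathrm{rev}}/T$ for an isothermal process.

```python
# Entropy change for a reversible, isothermal heat transfer: Delta S = Q_rev / T
q_rev = 500.0        # heat absorbed reversibly, in joules (illustrative value)
temperature = 300.0  # absolute temperature in kelvin (illustrative value)

delta_s = q_rev / temperature
print(delta_s, "J/K")  # ~1.667 J/K
```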

Statistical mechanics

In statistical mechanics entropy is defined by

$$S = -k_B \sum_i p_i \ln p_i$$

where $k_B$ is the Boltzmann constant, $i$ is the index for the microstates, and $p_i$ is the probability that microstate $i$ is occupied. In the microcanonical ensemble this gives:

$$S = k_B \ln \Omega$$

where $\Omega$ (sometimes written as $W$) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system. This equation provides a link between classical thermodynamics and statistical mechanics.
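As a sketch of how these definitions can be evaluated in practice (the function names below are illustrative, not taken from any particular library), the following Python snippet computes the Gibbs entropy of a discrete probability distribution and checks that, for $\Omega$ equally probable microstates, it reduces to $k_B \ln \Omega$.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def gibbs_entropy(probabilities):
    """S = -k_B * sum_i p_i ln p_i for a discrete set of microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

# Microcanonical check: Omega equally likely microstates give S = k_B ln(Omega)
omega = 1000
uniform = [1.0 / omega] * omega
print(gibbs_entropy(uniform))   # ~ k_B * ln(1000)
print(K_B * math.log(omega))    # same value
```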

Tsallis entropy

Tsallis (or non-additive) entropy [4] is defined as (Eq. 1)

$$S_q = k_B \frac{1 - \sum_i p_i^q}{q-1}$$

where $q$ is the Tsallis index [5]. As $q \to 1$ one recovers the standard expression for entropy. This expression for the entropy is the cornerstone of non-extensive thermodynamics.
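A minimal sketch of Eq. 1 under the same conventions as the previous snippet (function names again illustrative): it evaluates the Tsallis entropy for a given index $q$ and shows numerically that as $q \to 1$ the result approaches the standard (Gibbs) expression.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def tsallis_entropy(probabilities, q):
    """S_q = k_B * (1 - sum_i p_i^q) / (q - 1), the non-additive (Tsallis) entropy."""
    return K_B * (1.0 - sum(p ** q for p in probabilities)) / (q - 1.0)

def gibbs_entropy(probabilities):
    """Standard entropy, recovered from S_q in the limit q -> 1."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

p = [0.5, 0.3, 0.2]  # illustrative probability distribution
for q in (2.0, 1.1, 1.01, 1.001):
    print(q, tsallis_entropy(p, q))
print("q -> 1 limit:", gibbs_entropy(p))
```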

Arrow of time

Articles:

Books:

  • Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
  • Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
  • Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time" Oxford University Press (1997) ISBN 978-0-19-511798-1

See also:

References

Related reading

External links