"Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step." Constantino Tsallis [1]

Entropy was first described by Rudolf Julius Emanuel Clausius in 1865 [2]. The statistical mechanical description is due to Ludwig Eduard Boltzmann (Ref. ?). The word entropy originates from the Greek word "τροπή", meaning a turning or transformation [3].

Classical thermodynamics[edit]

In classical thermodynamics one has the entropy, $S$, defined via

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}$$

where $\delta Q_{\mathrm{rev}}$ is the heat reversibly transferred and $T$ is the temperature.
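For a reversible process at constant temperature, the relation above integrates to $\Delta S = Q_{\mathrm{rev}}/T$. A minimal Python sketch of this, using illustrative (assumed) numbers for melting 1 kg of ice:

```python
# Entropy change for a reversible, isothermal process:
# dS = dQ_rev / T integrates to  ΔS = Q_rev / T.
# Numbers below are illustrative assumptions: melting 1.0 kg of ice
# at 273.15 K, taking the latent heat of fusion as about 334 kJ/kg.

def delta_S_isothermal(q_rev, temperature):
    """Entropy change ΔS = Q_rev / T (in J/K) for heat Q_rev absorbed at constant T."""
    return q_rev / temperature

q = 1.0 * 334e3          # J, heat absorbed on melting 1 kg of ice (assumed value)
T = 273.15               # K, melting point of ice at atmospheric pressure
print(delta_S_isothermal(q, T))  # ≈ 1222.8 J/K
```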

Statistical mechanics[edit]

In statistical mechanics entropy is defined by

$$S = -k_B \sum_i p_i \ln p_i$$

where $k_B$ is the Boltzmann constant, $i$ is the index for the microstates, and $p_i$ is the probability that microstate $i$ is occupied. In the microcanonical ensemble this gives:

$$S = k_B \ln \Omega$$

where $\Omega$ (sometimes written as $W$) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system. This equation provides a link between classical thermodynamics and statistical mechanics.
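The reduction of the Gibbs formula to the microcanonical result can be checked numerically: with $\Omega$ equally likely microstates, $p_i = 1/\Omega$ and the sum collapses to $k_B \ln \Omega$. A minimal sketch (the function name is ours, not from the source):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact 2019 SI value)

def gibbs_entropy(probs, k=K_B):
    """S = -k * sum_i p_i ln p_i over microstates with probabilities p_i.
    Terms with p_i = 0 contribute nothing (lim p ln p -> 0)."""
    return -k * sum(p * math.log(p) for p in probs if p > 0.0)

# Microcanonical ensemble: Omega equally likely microstates, p_i = 1/Omega,
# so the general formula reduces to S = k_B ln Omega.
omega = 1000
uniform = [1.0 / omega] * omega
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(omega))
```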

Tsallis entropy[edit]

Tsallis (or non-additive) entropy [4] is defined as (Eq. 1)

$$S_q = k \, \frac{1 - \sum_i p_i^q}{q-1}$$

where $q$ is the Tsallis index [5]. As $q \to 1$ one recovers the standard expression for the entropy. This expression for the entropy is the cornerstone of non-extensive thermodynamics.
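The $q \to 1$ limit can be verified directly: by L'Hôpital's rule, $(1 - \sum_i p_i^q)/(q-1) \to -\sum_i p_i \ln p_i$. A minimal sketch, with $k = 1$ for simplicity (function name ours):

```python
import math

def tsallis_entropy(probs, q, k=1.0):
    """S_q = k * (1 - sum_i p_i^q) / (q - 1); tends to -k * sum_i p_i ln p_i as q -> 1."""
    if math.isclose(q, 1.0):
        # Analytic q -> 1 limit (standard Boltzmann-Gibbs entropy).
        return -k * sum(p * math.log(p) for p in probs if p > 0.0)
    return k * (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
s_bg = -sum(pi * math.log(pi) for pi in p)   # standard entropy, k = 1

# Near q = 1 the Tsallis expression approaches the standard one:
assert math.isclose(tsallis_entropy(p, 1.000001), s_bg, rel_tol=1e-4)
```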

Arrow of time[edit]



  • Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
  • Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
  • Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time" Oxford University Press (1997) ISBN 978-0-19-511798-1
