H-theorem
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the increase in the entropy of an ideal gas in an irreversible process. The H-theorem follows from considerations of Boltzmann's equation. Claude Shannon denoted his measure of information entropy H after the H-theorem.[1]
It appears to predict an irreversible increase in entropy, despite microscopically reversible dynamics. This has led to much discussion.
Quantum mechanical H-theorem
In quantum statistical mechanics (the quantum analogue of classical statistical mechanics), the H-function is defined as:[2]
\( H= \sum_i p_i \ln p_i, \, \)
where the summation runs over all possible distinct states of the system, and p_i is the probability that the system is found in the i-th state.
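As a quick numerical illustration (not drawn from the sources cited here), the H-function can be evaluated directly for any discrete distribution; a minimal Python sketch, with an arbitrary example distribution:

```python
# Minimal sketch: evaluating the H-function for a discrete probability
# distribution. The distribution is an arbitrary illustrative choice.
import math

p = [0.5, 0.25, 0.125, 0.125]       # probabilities of the distinct states
assert abs(sum(p) - 1.0) < 1e-12    # probabilities must sum to one

H = sum(pi * math.log(pi) for pi in p if pi > 0)
print(H)   # always <= 0; equals 0 only when one state has probability 1
```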
This is closely related to the entropy formula of Gibbs,
\( S = - k \sum_i p_i \ln p_i \; \)
and we shall (following e.g., Waldram (1985), p. 39) proceed using S rather than H.
First, differentiating with respect to time gives
\( \begin{align} \frac{dS}{dt} & = - k \sum_i \left(\frac{dp_i}{dt} \ln p_i + \frac{dp_i}{dt}\right) \\ & = - k \sum_i \frac{dp_i}{dt} \ln p_i \\ \end{align} \)
(using the fact that \( \sum_i dp_i/dt = 0 \), since \( \sum_i p_i = 1 \)).
Now Fermi's golden rule gives a master equation for the average rate of quantum jumps from state α to β and from state β to α. For an isolated system the jumps will make contributions
\( \begin{align} \frac{dp_\alpha}{dt} & = \sum_\beta \nu_{\alpha\beta}(p_\beta - p_\alpha) \\ \frac{dp_\beta}{dt} & = \sum_\alpha \nu_{\alpha\beta}(p_\alpha - p_\beta) \\ \end{align} \)
where the reversibility of the dynamics ensures that the same transition constant ναβ appears in both expressions.
Substituting these rates into the expression for dS/dt, and symmetrizing the sum over the pairs of states (α, β) (which is the origin of the factor 1/2), gives
\( \frac{dS}{dt} = \frac{1}{2} k \sum_{\alpha,\beta} \nu_{\alpha\beta}(\ln p_{\beta}-\ln p_{\alpha})(p_{\beta}- p_{\alpha}). \)
But ln is an increasing function, so the two brackets always have the same sign; each contribution to dS/dt is therefore non-negative.
Therefore
\( \Delta S \geq 0 \, \)
for an isolated system.
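A minimal numerical sketch of this argument, assuming an arbitrary symmetric rate matrix and setting k = 1 (all names and parameters below are illustrative, not from the sources):

```python
# Evolve a distribution under the master equation above, with symmetric
# rates nu_ab = nu_ba, and check that the Gibbs entropy (k = 1) never
# decreases. Rates, initial state, and step size are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 5
nu = rng.random((n, n))
nu = (nu + nu.T) / 2                      # reversibility: nu_ab = nu_ba
np.fill_diagonal(nu, 0.0)                 # no self-jumps

p = rng.random(n)
p /= p.sum()                              # normalised initial probabilities

def entropy(q):
    return -np.sum(q * np.log(q))         # S = -sum_i p_i ln p_i

dt, S_prev = 1e-3, entropy(p)
for _ in range(10_000):
    p += dt * (nu @ p - nu.sum(axis=1) * p)   # dp_a/dt = sum_b nu_ab (p_b - p_a)
    S = entropy(p)
    assert S >= S_prev - 1e-12            # dS/dt >= 0, up to round-off
    S_prev = S

print(p)                                  # relaxes toward the uniform distribution
```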
The same mathematics is sometimes used to show that relative entropy is a Lyapunov function of a Markov process in detailed balance, and in other contexts such as chemical kinetics.
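A companion sketch of that statement, with rates constructed (arbitrarily) to satisfy detailed balance with respect to a chosen stationary distribution pi; the construction w_ab = s_ab pi_a with symmetric s is one standard way to guarantee w_ab pi_b = w_ba pi_a:

```python
# Check that the relative entropy D(p||pi) = sum_i p_i ln(p_i/pi_i) is a
# Lyapunov function (monotonically non-increasing) for a master equation
# in detailed balance. All numerical choices are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 4
pi = rng.random(n); pi /= pi.sum()        # chosen stationary distribution

s = rng.random((n, n)); s = (s + s.T) / 2 # symmetric "bare" rates
w = s * pi[:, None]                       # w[a, b]: rate of jumps b -> a
np.fill_diagonal(w, 0.0)                  # detailed balance: w_ab pi_b = w_ba pi_a

p = rng.random(n); p /= p.sum()

def rel_entropy(q, ref):
    return np.sum(q * np.log(q / ref))

dt, D_prev = 1e-3, rel_entropy(p, pi)
for _ in range(20_000):
    p += dt * (w @ p - w.sum(axis=0) * p) # dp_a/dt = sum_b (w_ab p_b - w_ba p_a)
    D = rel_entropy(p, pi)
    assert D <= D_prev + 1e-12            # D never increases
    D_prev = D

print(D)                                  # tends to 0 as p approaches pi
```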
H is a forerunner of Shannon's information entropy. The article on Shannon's information entropy contains a good explanation of the discrete counterpart of the quantity H, known as the information entropy or information uncertainty (with a minus sign). By extending the discrete information entropy to the continuous information entropy, also called differential entropy, one obtains the expression in Eq. (1), and thus a better feel for the meaning of H.
The H-theorem's connection between information and entropy plays a central role in a recent controversy, the black hole information paradox.
Boltzmann's H-theorem
Starting with a function f that defines the number of molecules in a small region of µ-space denoted by \( \delta q_1 \cdots \delta p_r \),
\( \delta n = f(q_1 \ldots p_r, t)\,\delta q_1 \cdots \delta p_r. \, \)
Tolman offers the following equations for the definition of the quantity H in Boltzmann's original H-theorem.
\( H= \sum_i f_i \ln f_i \,\delta q_1 \cdots \delta p_r \)[3]
Here we sum over the i regions into which µ-space is divided.
This relation can also be written in integral form.
\( H= \int \cdots \int f \ln f \,dq_1 \cdots dp_r \)[4]
H can also be written in terms of the number of molecules n_i present in each of the i cells:
\( \begin{align} H & = \sum_i ( n_i \ln n_i - n_i \ln \delta v_\gamma) \\ & = \sum_i n_i \ln n_i + \text{constant} \end{align} \)[5]
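This identity is easy to check numerically; a minimal sketch, assuming equal cell volumes, with hypothetical cell populations (none of these numbers come from the sources):

```python
# Verify that sum_i f_i ln(f_i) dv and sum_i n_i ln(n_i) differ only by
# the distribution-independent constant N ln(dv), using n_i = f_i * dv.
import math

dv = 0.01                                 # volume of each cell in mu-space
n = [120, 75, 40, 15]                     # hypothetical cell populations
N = sum(n)

f = [ni / dv for ni in n]                 # distribution function per cell
H_f = sum(fi * math.log(fi) * dv for fi in f)
H_n = sum(ni * math.log(ni) for ni in n)

print(math.isclose(H_f, H_n - N * math.log(dv)))   # True
```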
An additional way to calculate the quantity H is:
\( H = -\ln P + \text{constant}, \)[6]
where P is the probability of finding a system, chosen at random from the specified microcanonical ensemble, in the given state.
Finally, H can be written as:
\( H = -\ln G + \text{constant}, \)[7]
where G may be spoken of as the number of classical states.
The quantity H can also be defined as the integral over velocity space[citation needed]:
\( H \ \stackrel{\mathrm{def}}{=}\ \int P \ln P \, d^3 v = \left\langle \ln P \right\rangle \qquad (1) \)
where P(v) is the probability distribution of molecular velocities.
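As a sketch of how Eq. (1) can be evaluated in practice, assume a Maxwellian P(v) with unit mass and kT = 1 (unit variance per component), for which the closed form H = −(3/2) ln(2πe) is known:

```python
# Evaluate Eq. (1) numerically for a unit-variance Maxwellian and compare
# with the closed form -(3/2) ln(2*pi*e). Grid limits and resolution are
# arbitrary choices; the Maxwellian factorises over the three velocity
# components, so the 3-D integral is three times the 1-D one.
import numpy as np

v = np.linspace(-10, 10, 2001)            # 1-D velocity grid
dv = v[1] - v[0]
p = np.exp(-v**2 / 2) / np.sqrt(2 * np.pi)

H = 3 * np.sum(p * np.log(p)) * dv        # H = integral P ln P d^3 v
print(H, -1.5 * np.log(2 * np.pi * np.e)) # the two agree to high accuracy
```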
Using the Boltzmann equation one can prove that H can only decrease.
For a system of N statistically independent particles, H is related to the thermodynamic entropy S through:
\( S \ \stackrel{\mathrm{def}}{=}\ - N k H \)
so, according to the H-theorem, S can only increase.
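The full Boltzmann collision integral is beyond a short sketch, but a relaxation-time (BGK-style) caricature of the collisions already shows the qualitative content of the theorem: H decreases monotonically toward its Maxwellian minimum, so S = −NkH increases. All parameters below are illustrative assumptions:

```python
# BGK-style caricature of the Boltzmann equation in one velocity dimension:
# collisions relax P toward the Maxwellian P_eq with the same energy, and
# H(t) = integral P ln P dv decreases monotonically.
import numpy as np

v = np.linspace(-10, 10, 2001)
dv = v[1] - v[0]

def H_of(P):
    return np.sum(P * np.log(np.clip(P, 1e-300, None))) * dv

# Far-from-equilibrium start: two counter-drifting beams.
P = 0.5 * (np.exp(-(v - 2)**2 / 2) + np.exp(-(v + 2)**2 / 2)) / np.sqrt(2 * np.pi)

var = np.sum(P * v**2) * dv               # energy, conserved by collisions
P_eq = np.exp(-v**2 / (2 * var))
P_eq /= np.sum(P_eq) * dv                 # Maxwellian with the same energy

tau, dt, H_prev = 1.0, 0.01, H_of(P)
for _ in range(2000):
    P += dt * (P_eq - P) / tau            # relaxation-time collision term
    H = H_of(P)
    assert H <= H_prev + 1e-12            # H never increases
    H_prev = H

print(H_of(P), H_of(P_eq))                # H has relaxed to its minimum
```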
However, Loschmidt objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong (Loschmidt's paradox). The explanation is that Boltzmann's equation is based on the assumption of "molecular chaos", i.e., that it is acceptable for all the particles to be considered independent and uncorrelated. This in fact breaks time reversal symmetry and therefore begs the question.
Analysis
At the heart of the H-theorem is the replacement of 1-state to 1-state deterministic dynamics by many-state to many-state Markovian mixing, with information lost at each Markovian transition.
Gull is correct that, with the powers of Laplace's demon, one could in principle map the ensemble of the original possible states of the N-particle system forward exactly, and lose no information. But this would not be very interesting. Part of the program of statistical mechanics, not least the MaxEnt school of which Gull is an enthusiastic proponent, is to see just how much of the detailed information in the system one can ignore, and yet still correctly predict experimentally reproducible results.
The H-theorem's program of regularly throwing information away, either by systematically ignoring detailed correlations between particles or between particular sub-systems, or through systematic regular coarse-graining, leads to predictions such as those from the Boltzmann equation for dilute ideal gases or from the recent entropy-production fluctuation theorem, which are useful and reproducibly observable. Such predictions also mean that we have learnt something qualitative about the system, namely which parts of its information are useful for which purposes; this is knowledge additional even to the full specification of the microscopic dynamical particle trajectories.
(It may be interesting that, having rounded on the H-theorem for not considering the detail of the microscopic dynamics, Gull then chooses to demonstrate the power of the extended-time MaxEnt/Gibbsian method by applying it to a Brownian motion example: a not so dissimilar replacement of detailed deterministic dynamical information by a simplified stochastic/probabilistic summary!)
However, it is an assumption that the H-theorem's coarse-graining is not getting rid of any 'interesting' information. With such an assumption, one moves firmly into the domain of predictive physics: if the assumption goes wrong, it may produce predictions which are systematically and reproducibly wrong.
See also
Loschmidt's paradox
Arrow of time
Second law of thermodynamics
Fluctuation theorem
Notes
^ Gleick 2011
^ Tolman 1938 pg. 460 formula 104.7
^ Tolman 1938 pg. 135 formula 47.5
^ Tolman 1938 pg. 135 formula 47.6
^ a b Tolman 1938 pg. 135 formula 47.7
^ Tolman 1938 pg. 135 formula 47.8
^ Tolman 1938 pg. 136 formula 47.9
References
Lifshitz, E. M.; Pitaevskii, L. P. (1981). Physical Kinetics. Course of Theoretical Physics. 10 (3rd ed.). Pergamon. ISBN 0-08-026480-8.
Waldram, J. R. (1985). The Theory of Thermodynamics. Cambridge University Press. ISBN 0-521-28796-0.
Tolman, R. C. (1938). The Principles of Statistical Mechanics. Oxford University Press; reprinted by Dover (1979). ISBN 0-486-63896-0.
Gull, S. F. (1989). "Some Misconceptions about Entropy". In Buck, B.; Macaulay, V. A. (eds.), Maximum Entropy in Action. Oxford University Press, 1991. ISBN 0-19-853963-0.
Reif, F. (1965). Fundamentals of Statistical and Thermal Physics. McGraw-Hill. ISBN 978-0070518001.
Gleick, J. (2011). The Information: A History, a Theory, a Flood. Random House Digital. ISBN 978-0375423727.
Badino, M. (2011). "Mechanistic Slumber vs. Statistical Insomnia: The early history of Boltzmann's H-theorem (1868–1877)". European Physical Journal H 36: 353. Bibcode:2011EPJH...36..353B. doi:10.1140/epjh/e2011-10048-5.