Statistical Physics: A Probabilistic Approach

by Bernard H. Lavenda

eBook

$29.95 

Overview

Suitable for graduate students in chemical physics, statistical physics, and physical chemistry, this text develops an innovative, probabilistic approach to statistical mechanics. The treatment employs Gauss's principle and incorporates Bose-Einstein and Fermi-Dirac statistics to provide a powerful tool for the statistical analysis of physical phenomena.
The treatment begins with an introductory chapter on entropy and probability that covers Boltzmann's principle and thermodynamic probability, among other topics. Succeeding chapters offer a case history of black radiation, examine quantum and classical statistics, and discuss methods of processing information and the origins of the canonical distribution. The text concludes with explorations of statistical equivalence, radiative and material phase transitions, and the kinetic foundations of Gauss's error law. Bibliographic notes complete each chapter.

Product Details

ISBN-13: 9780486815206
Publisher: Dover Publications
Publication date: 08/01/2016
Series: Dover Books on Physics
Sold by: Barnes & Noble
Format: eBook
Pages: 384
File size: 31 MB
Note: This product may take a few minutes to download.

About the Author

Bernard H. Lavenda was Professor of Chemical Physics at Italy's University of Camerino. His other books include A New Perspective on Reality: An Odyssey in Non-Euclidean Geometry and Where Physics Went Wrong.

Read an Excerpt

Statistical Physics

A Probabilistic Approach


By Bernard H. Lavenda

Dover Publications, Inc.

Copyright © 1991 Bernard Lavenda
All rights reserved.
ISBN: 978-0-486-81520-6



CHAPTER 1

Entropy and Probability


1.1 The Predecessors of Boltzmann

There are two basic categories of thermodynamic theories: phenomenological and probabilistic. Chronologically, the former precedes the latter because thermodynamics evolved from the observations made on steam engines by engineers, like Sadi Carnot, at the beginning of the nineteenth century. These observations were formalized into principles by physicists, like Rudolf Clausius, during the middle part of the nineteenth century. According to these principles, "heat is energy" (first law) and "heat flows spontaneously from hot to cold" (second law). According to Clausius, the first principle can be phrased as "the energy of the universe is constant" while the second law introduces the abstract concept of "entropy" in which the entropy of the universe tends to a maximum. It is precisely this property of the entropy that had apocalyptic consequences since it predicted that the universe would end in a heat death caused by thermal interactions that lead to an unending increase in entropy.

But in what sense is a system ever "left to itself" or completely isolated? For if it were completely isolated, there would be no way for the energy, or for that matter any other thermodynamic variable, to change, and consequently the entropy could not increase. If, on the other hand, there were some means by which we could alter the energy or the other thermodynamic parameters necessary to specify the state of the system, the entropy could be made to vary at will, in violation of the second law of thermodynamics as formulated by Clausius.

By the turn of the century it became evident that there was something incomplete about Clausius' formulation. There began a search for an alternative approach that would avoid such cataclysmic consequences. This alternative approach became known as the "statistical" or "probabilistic" formulation. It asserts that "heat is a form of random molecular motion" and "entropy is a measure of disorder or the probability of realizing a given macroscopic state in terms of the number of microscopic 'complexions' that are compatible with it." Equilibrium would be that state with maximum probability or, equivalently, maximum entropy. In contrast to the phenomenological formulation, equilibrium would not be a static final resting state of the system; rather, thermal equilibrium would be characterized by a distribution of constantly fluctuating configurations.

How can this be brought about in a truly isolated system? Planck's response was that

no system is ever truly isolated; there is always an exchange of energy with the outside world, no matter how small or irregular. This exchange allows us to measure the temperature of the system. The "loose coupling" between system and environment allows the system to change its state — albeit on a rather irregular basis.


The consequence of this was to lower the second law from the status of an absolute truth to one of high probability. However, it still left open the question of how to determine the entropy from molecular considerations, which necessarily had to agree with the phenomenological expression for the entropy. Since the molecular variables are necessarily random, the resulting entropy, which is a function of those variables, would also be a random quantity. It thus appeared that one abstract definition of entropy was being substituted for another.

Ideally, one would like to begin with the dynamics of large assemblies of molecules and show how the system evolves to its final state. However, as Boltzmann realized, these reversible laws must always be completed by some hypothesis of randomness. More recently, it has been shown that randomness can be introduced in a deterministic framework by going to an appropriate asymptotic limit, such as the Brownian motion limit. Since nature never goes to such limits,1 our real interest lies in the random hypothesis. However, once the random hypothesis is introduced, the link with the reversible world of molecular motions has been severed. With no direct link-up with the kinetic foundations, we are led to focus our attention on the elements of randomness. By concentrating on the nature of the randomness itself, we are able to avoid entering the reversible microscopic world of atoms and treat the aspect of irreversibility which appears at the macroscopic stage. This is the inception of a probabilistic formulation of thermodynamics, devoid of any particular dynamical feature that may filter through to the macroscopic world from the microscopic one. The stage has now been set for Boltzmann's contribution to this probabilistic formulation.


1.2 Boltzmann's Principle

The following inscription (in our notation),

S = k ln ω, (1.1)

relating the entropy S to the logarithm of the so-called thermodynamic probability ω is engraved on Boltzmann's tombstone. The constant of proportionality in Boltzmann's principle is k. Planck did not at all appreciate k being referred to as Boltzmann's constant, since it was he who discovered it. According to Planck, "Boltzmann never calculated with molecules but only with moles," and it therefore never occurred to him to introduce such a factor. Planck was preoccupied with the absolute nature of the entropy: without the factor of proportionality between the entropy S and the thermodynamic probability ω, there would necessarily appear an undetermined additive constant in (1.1). Whereas Boltzmann considered the enumeration of the microscopic complexions belonging to a macroscopic state to be an "arithmetical device of a certain arbitrary character," Planck was to learn that it certainly was not.

This universal constant, Planck contended, "is the same for a terrestrial as for a cosmical system, and when it is known for one, it is known for the other; when k is known for radiant phenomena, it is also known and is the same for molecular motions." The importance of Boltzmann's principle cannot be overestimated since it led the way toward the theory now known as statistical mechanics.

Early in his career, Boltzmann thought that a theory of heat could be reduced to a purely mechanical interpretation. The years between 1869 and 1872 saw a large infusion of probabilistic notions into Boltzmann's predominantly mechanistic view. The culmination came in 1872 with the first enunciation of what would later be known as the "H-theorem." Slowly he became converted to the idea that the full content of the second law would only be grasped when its roots were sought in the theory of probability, abandoning his earlier belief that thermodynamics could be reduced to mechanics. Boltzmann asserted that "if the initial distribution amongst the bodies did not correspond to the laws of probability, it will tend increasingly to become so." The entropy would be a measure of the tendency of the system to become allied with these probabilistic laws. He argued that it cannot be fortuitous that, in its most probable state, the velocities of a very large aggregate of gas molecules possess the same distribution as the "errors of observation that always creep in when the same quantity is repeatedly determined by measurement under the same conditions."

The very sharpest definition of a macroscopic state we have is the number of its microscopic complexions that are compatible with it. The most probable state of the system is one in which the number of microscopic complexions is the greatest; this corresponds to the state of maximum entropy or the most "disorderly" state and coincides with the thermodynamic notion of equilibrium. It may be said that Clausius elevated the second law to an absolute truth for which the entropy of a composite system, obtained by bringing two isolated systems into thermal contact, cannot be smaller than the sum of the entropies of the individual systems, while Boltzmann lowered it to one of high probability. To quote Gibbs, "The impossibility of an uncompensated decrease of entropy seems to be reduced to an improbability."
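To make the counting concrete, here is a toy illustration (ours, not the text's): N labeled molecules are distributed between the two halves of a box, the macroscopic state being specified only by the number n in the left half. The short Python sketch below, with all names our own, counts the complexions ω(n) and locates the most probable state:

    from math import comb, log

    # Toy macrostate: n of N labeled particles occupy the left half of a box.
    # Its "thermodynamic probability" omega(n) is the number of microscopic
    # complexions (assignments of particles to halves) compatible with it.
    N = 100

    def omega(n):
        """Number of complexions with n of the N particles on the left."""
        return comb(N, n)

    def entropy(n):
        """Boltzmann's principle (1.1) with k = 1: S = ln omega."""
        return log(omega(n))

    # The most probable macrostate -- maximum omega, hence maximum entropy,
    # the "most disorderly" state -- is the uniform one, n = N/2.
    n_star = max(range(N + 1), key=omega)
    print(n_star, entropy(n_star))  # 50  66.78...

Already at N = 100 the uniform state outnumbers the fully segregated one (n = 0) by a factor of about 10^29, a foretaste of the law of large numbers invoked below.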

Boltzmann transferred his attention from the fatalistic implications of the second law, where energy is continually being degraded into "purely thermal vibrations [which] slip through our hands and escape our senses and which for us is synonymous with rest," to the amazing uniformity of the final state, for which the law of large numbers had to be responsible. Boltzmann drew an analogy with the probability that a large number of people should be found in complete agreement. Such a circumstance is not impossible, he claimed, although it is highly improbable: it simply would not conform to the laws of probability. Every dissension would amount to a degradation of energy, and in the state of complete disagreement, the "degraded energy forms will be none but the most probable forms; or better, it will be the energy that is distributed amongst the molecules in the most probable way." The final distribution conforms to the laws of probability, in which deviations from the most probable values will be governed by laws of error, the statistical independence of the observations being completely harmonious with the thermodynamic property of additivity.

It took Planck a period of years to abandon Clausius' interpretation and to reconcile his ideas with those of Boltzmann. Planck's conversion was due to the simple fact that it was only Boltzmann's approach which provided a theoretical basis for the expression of the entropy of black radiation that Planck had so luckily guessed and which fitted the data remarkably well — perhaps too well for it to be a lucky guess! It was one of those macabre twists of fate that Boltzmann did not live to see the fruit of his efforts, for which he struggled so laboriously in his lifetime to gain universal acceptance.

However, the unity of thermodynamics and probability theory afforded by Boltzmann's relation (1.1) brought with it a certain opaqueness to the concept of probability. For one thing, the "thermodynamic" probability is an integer, and a large integer at that, so it cannot be considered a probability (which, of course, must be a proper fraction) at all. Boltzmann's argument that the actual number of objects is a more perspicuous concept than a mere probability is not convincing. If ω represents the number of microscopic complexions corresponding to a macroscopic state, then we might try to divide ω by the total number of complexions in order to get a proper fraction which can represent a probability. Apart from the fact that this does not lead to the correct thermodynamic result, not just any value of ω can be related to the thermodynamic entropy, but rather its maximum value, which is determined by imposing the constraints that the total number of particles and the total energy are constant. Since the thermodynamic entropy corresponds to the state of thermodynamic equilibrium, for which there is an overwhelmingly large number of indistinguishable microstates, the ratio of the maximum of ω to the total number of complexions is effectively unity, giving a trivial result for the entropy.

We may, however, ask for the probability of any given state. This probability is the ratio of the statistical weight of the state, ω, to the sum of the statistical weights of all the macroscopic states that are compatible with the given constraints. Since the ratio of the maximum of ω to this sum is effectively unity, we may replace the sum in the denominator by the maximum of ω, which we shall denote by ωmax. The probability of any state whose statistical weight is ω is thus

ω/ωmax = exp(S − Smax),


where Smax is the maximum value of the entropy. It is this form of Boltzmann's principle that was used so successfully by Einstein in his study of thermodynamic fluctuations.
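For completeness (a step the excerpt leaves implicit), the ratio follows in one line from Boltzmann's principle (1.1) once the entropy is measured in units of k, a convention made explicit below when the constant is reinstated:

S = ln ω and Smax = ln ωmax, whence ω/ωmax = exp(S − Smax).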

In a sense, Einstein "inverted" Boltzmann's principle (1.1) to obtain information regarding fluctuations about the state of equilibrium. According to Planck in his Theory of Heat, the entropy difference,

ΔS = S − Smax,


is negative, and introducing this into the above expression, we get

ω/ωmax = exp (ΔS). (1.2)


Since ΔS is negative, this is a proper fraction, as it should be. If we reinstate Boltzmann's constant, it becomes apparent that, owing to the smallness of k, moderate values of ΔS will lead to vanishingly small probabilities, so large deviations from equilibrium will be extremely rare. It is only when the deviation from equilibrium, |ΔS|, is of the same order of magnitude as Boltzmann's constant that there will be a finite probability for fluctuations to occur. Such fluctuations will necessarily have an exceedingly small amplitude unless conditions for magnifying their effect are present.
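An order-of-magnitude check (our illustration, not the text's) shows just how suppressive the exponent is. The Python sketch below simply evaluates (1.2) with k reinstated:

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    def relative_probability(delta_S):
        """Einstein's formula (1.2) with k reinstated: exp(dS/k), dS <= 0 in J/K."""
        return math.exp(delta_S / k)

    # An entropy deficit that is minute on the macroscopic scale is already fatal:
    print(relative_probability(-1.0e-20))  # ~2e-315: never observed in practice

    # Only deviations of the order of k itself remain appreciably probable:
    print(relative_probability(-k))        # exp(-1) ~ 0.37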

In the atmosphere, for example, small adjacent regions of widely different densities act as scattering centers for light. Scattering will be appreciable only when the wavelength of the light is of the same order as the dimension of the region. Since large fluctuations are more probable in small regions, blue light is more likely to be scattered than red light.

Another situation in which fluctuations will be observable is the irregular motion of small particles caused by the chaotic thermal motions of the lighter surrounding molecules. This is the phenomenon known as Brownian motion. Also, one should expect nonnegligible fluctuations in first-order phase equilibria due to large differences in the densities of the two phases. In the phenol-water system, a curious phenomenon is observed as the critical temperature is approached from above. Immediately before the single homogeneous phase splits up into two phases, a milky iridescence appears. This critical opalescence is caused by the scattering of light from neighboring regions having slightly different densities. It was this phenomenon which was analyzed by Einstein in 1910 on the basis of his formula (1.2) for a spontaneous fluctuation [cf. §1.11].

Einstein's formula, (1.2), allows us to determine the relative frequency with which deviations from equilibrium occur. To implement it, we could attempt a Taylor series expansion of S about equilibrium, and if the fluctuations are small, we could neglect terms higher than second order. Since the entropy is a maximum at equilibrium, the coefficient of the first-order term would vanish and we would be left with a negative quadratic form in the exponent. The negativity of the (straight) coefficients of the quadratic terms comprises the thermodynamic stability criteria.
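Written out, the naive expansion just described reads, for a generic internal parameter x with equilibrium value xeq and entropy in units of k,

S(x) ≈ Smax + ½ S''(xeq)(x − xeq)², with S''(xeq) < 0,

so that (1.2) would give a Gaussian law for small fluctuations,

ω/ωmax ≈ exp(−½ |S''(xeq)| (x − xeq)²).

This is only a sketch of the step the text describes; the next paragraphs explain why, taken naively, it will not do.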

The problem with the above proposal is that the entropy change is not that of the system but rather the total entropy change of system + reservoir. The usual form of the entropy maximum principle is

The entropy tends to a maximum for a given value of the total energy.

What is intended in this statement is that the entropy be a maximum with respect to some internal parameter, with respect to which the first derivative vanishes. But we are dealing with the extensive variables: the internal energy E, the volume V, and the number of particles N. The derivative of the entropy with respect to any one of these variables does not vanish at equilibrium; rather, it defines the conjugate intensive variable. So a naive Taylor series expansion will certainly not do. And certainly, Planck and his contemporaries did not make such a banal error.
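The derivatives in question are the standard thermodynamic identities (a reminder, not part of the excerpt):

(∂S/∂E)V,N = 1/T,   (∂S/∂V)E,N = p/T,   (∂S/∂N)E,V = −μ/T,

none of which vanishes at equilibrium: each defines the conjugate intensive variable, namely the temperature T, the pressure p, and the chemical potential μ.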

Rather, they considered a composite system in which two subsystems are separated by a thin piston, occupying no volume, which is a good conductor of heat, is impermeable to the passage of matter, and moves without friction. Let the suffixes 1 and 2 denote the subsystems. For fixed subvolumes, we can ask for the probability that the energy of subsystem 1 is between E1 and E1 + dE1 at the same time the energy of subsystem 2 is between E2 and E2 + dE2. The "probability" for the individual subsystems will be given by ω1(E1, V1) dE1 and ω2(E2, V2)dE2 so that the combined probability should be given by

ω1(E1, V1)ω2(E2, V2) dE1dE2. (1.3)

This would be the case if we were considering independent events and a realization of the energy in subsystem 1 had no effect upon a realization of the energy in 2. But the composite system is isolated from the rest of the world and this implies energy conservation, namely, E1 + E2 = E = const. So E1 and E2 are not independent random variables, and the joint probability will not reduce to the product of the individual probabilities as (1.3) implies.
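One standard way of expressing the constraint (a sketch consistent with the text, not its continuation): since E2 = E − E1 is fixed once E1 is given, the statistical weight of a partition of the energy is

W(E1) dE1 ∝ ω1(E1, V1) ω2(E − E1, V2) dE1,

in which the two factors are tied together through E − E1; this is not the product form (1.3) that independence of E1 and E2 would require.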


(Continues...)

Excerpted from Statistical Physics by Bernard H. Lavenda. Copyright © 1991 Bernard Lavenda. Excerpted by permission of Dover Publications, Inc.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Prologue
1. Entropy and Probability
2. Black Radiation: A Case History
3. From One to Infinity
4. Processing Information
5. Origins of the Canonical Distribution
6. Statistical Equivalence Principle
7. Radiative and Material Phase Transitions
8. Kinetic Foundations of Gauss' Error Law
Bibliographic Notes
Index