Saturday, February 13, 2010

Statistical Mechanics Entropy Revisited

In the study of heat, and of thermodynamics in general, there was no need to know the microscopic nature of the substance being heated or cooled. Instead, we reasoned using general principles such as temperature, work, and energy conservation and explained a number of important phenomena. We then introduced the idea of entropy, which is needed for explaining the physically observed irreversibility of the world around us. The Second Law of Thermodynamics was then shown to imply a number of new results, including the fact that there can never be a perpetual motion machine. The idea of energy is well understood from mechanics; to explain thermodynamics, the new and profound idea of entropy had to be invented. It is the idea of entropy that is central and unique to the study of heat and thermodynamics, and the branch of physics called statistical mechanics has resulted from the attempt to understand entropy from a microscopic point of view. Recall from (9.41) that we have a microscopic definition of entropy $S$ given by


\begin{displaymath} \mbox{\rm {Entropy}}=k \ln \mbox{\rm {(Number of configurations)}} \end{displaymath}
\begin{displaymath} S=k \ln \Gamma \end{displaymath}

The microscopic definition unavoidably led to assumptions about what matter is made of, namely the microscopic composition of matter. For example, in applying the equation for entropy to the case of an ideal gas, we had to take into account the microscopic nature of the gas, in particular, that it is made out of an enormous collection of microscopic objects that we identified with atoms. The entire field of statistical mechanics was founded by Boltzmann in the late nineteenth century. As a historical aside, it is worth recording that it was in order to understand the concept of entropy from a microscopic point of view that Boltzmann postulated the existence of atoms, well before their experimental confirmation in the twentieth century. In sum, the challenge posed by thermodynamics was the following: how can we reconcile ideas such as temperature, entropy and so on with the ideas of (Newtonian) mechanics? In particular, if any sample of matter that we observe in daily life is made out of an inordinately large number of atoms, approximately $N_{\mathrm{Avogadro}} \simeq 10^{23}$, how can we apply the laws of mechanics to this large collection of particles? Clearly, it is hopeless to try and describe how every single particle is moving, as this would involve specifying, at each instant, on the order of $N_{\mathrm{Avogadro}}$ positions and velocities. So what is the way out of this impasse?
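As a concrete illustration of the counting formula, consider a toy system of $N$ independent two-state elements, for which the number of configurations is $\Gamma = 2^N$ and hence $S = Nk\ln 2$. A minimal Python sketch of this counting, with purely illustrative numbers:
\begin{verbatim}
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K

def entropy_two_state(N):
    # S = k ln(Gamma) with Gamma = 2**N; use ln(2**N) = N ln 2
    # so the huge number 2**N never has to be formed explicitly.
    return k_B * N * math.log(2.0)

N_Avogadro = 6.022e23
print(entropy_two_state(N_Avogadro))   # about 5.76 J/K for this toy system
\end{verbatim}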

Classical Statistical Mechanics

We know for a fact that all matter is composed of small particles called atoms. Statistical mechanics is that branch of physics which explains the thermodynamic properties of nature starting from a microscopic point of view. In particular, we will attempt to apply classical mechanics to a large collection of particles, and in this way derive all the results of thermodynamics. We know that classical (Newtonian) mechanics cannot explain why atoms even exist, let alone explain their properties, for which quantum mechanics is necessary. So can we at all analyze a large collection of atoms classically, before understanding quantum mechanics? The answer, surprisingly enough, is yes. The reason is the following. In studying statistical mechanics, we will be concerned with the object's bulk (macroscopic) properties such as temperature, energy and so on. These properties result mainly from the interactions of atoms (and molecules) with each other. Recall that atoms are electrically neutral, and are composed of a positively charged nucleus and negatively charged electrons which are distributed outside the nucleus. Let there be two atoms at positions ${\bf r_1}$ and ${\bf r_2}$. The distance between them is then given by $r=\vert{\bf r_1}-{\bf r_2}\vert$; the so-called Lennard-Jones potential results from the quantum mechanical interaction of the charges and the angular momentum carried by the atoms. The potential due to a typical atom or molecule is given by
\begin{displaymath} U_{LJ}(r)=U_0\left\{\left(\frac{R_0}{r}\right)^{12}-2\left(\frac{R_0}{r}\right)^{6}\right\} \end{displaymath} (10.1)

where $U_0$ is a constant which depends on the charges and $R_0$ is the Lennard-Jones (LJ) radius; the potential is shown in Figure 10.1. Note that the inter-atomic potential has its minimum at a distance of $R_0$ from the atom.

Figure 10.1: Lennard-Jones Potential
As long as the atoms are moving slowly, and are farther away from each other than the distance $R_0$, they can be treated as hard spheres of radius $R_0$ that behave as classical particles. For typical atoms and molecules the LJ radius is around 3 to 5 Å (1 Å $=10^{-10}$ m). For example, the LJ radius of the argon atom is 3.5 Å, and that of a large molecule such as propane is 5 Å. However, in some cases the LJ radius is not suitable for determining the effective classical size of an atom. For example, the $H_2$ molecule has an LJ radius of 0.7 Å, which is too small a distance to be taken as the classical radius of the $H_2$ molecule. As long as the object being analyzed is at temperatures and densities that are not very high or very low, the atoms are not squeezed together closer than the LJ radius, and we can treat the atoms as classical billiard balls. However, at very low temperatures and high densities this is not true, and the classical analysis needs to be replaced by quantum mechanics. At very high temperatures, the inner structure of the atoms, composed as they are of a nucleus and electrons, needs to be taken into account, and this requires an analysis which goes beyond classical mechanics.
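A minimal numerical sketch of the potential (10.1), using an assumed well depth $U_0$ and the argon-like LJ radius $R_0=3.5$ Å (values chosen purely for illustration), confirms that the minimum sits at $r=R_0$ with depth $-U_0$:
\begin{verbatim}
import numpy as np

def U_LJ(r, U0, R0):
    # Lennard-Jones potential of eq. (10.1): repulsive (R0/r)^12 term,
    # attractive (R0/r)^6 term, with a minimum of depth -U0 at r = R0.
    x = (R0 / r) ** 6
    return U0 * (x ** 2 - 2.0 * x)

U0 = 1.65e-21    # assumed well depth in joules (~0.01 eV), illustrative only
R0 = 3.5e-10     # LJ radius in metres (3.5 Angstrom)

r = np.linspace(0.8 * R0, 3.0 * R0, 2000)
U = U_LJ(r, U0, R0)
i_min = np.argmin(U)
print(r[i_min] / R0)   # ~1.0 : the minimum sits at r = R0
print(U[i_min] / U0)   # ~-1.0: the depth of the well is -U0
\end{verbatim}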

Ensembles

We can now return to the problem at hand, namely that we have a collection of $N=N_{\mathrm{Avogadro}} \simeq 10^{23}$ classical particles, thought of as hard spheres of radius $R_0$, having mass $m$, and at temperature $T$. We would like to derive all the thermodynamic properties of the object in question starting from Newtonian mechanics. For the sake of concreteness, let us consider an ideal gas at temperature $T$, confined in a container of volume $V$, and let us further suppose that the gas is in equilibrium. By the gas being ideal, we mean that all the interactions of the particles which compose the gas can be ignored. The energy of the gas hence consists entirely of kinetic energy; let the three-dimensional velocity of the $n$-th particle be denoted by ${\bf v}_n=(v,u,w)$. Since there are $N$ particles, the total kinetic energy of the gas is simply the sum of the kinetic energies of the individual particles (atoms). Hence the energy of the gas is given by the following
\begin{displaymath} E_{GAS}=\frac{1}{2}m\sum_{n=1}^{N}{\bf v}_n^2 \end{displaymath} (10.2)

Recall that by equilibrium we mean that the gas has attained a state of maximum entropy, or equivalently, that there are no further changes of temperature and other state variables taking place. By the statement that the gas is at temperature $T$, we mean that the gas in question is in contact with a heat bath which is at temperature $T$. The very fact that we have introduced the physical idea of temperature already implies that the gas is not an isolated system, but rather is part of a larger system which includes the heat bath and the object at the given temperature.

Figure 10.2: Gas in contact with a heat bath
How do we describe a gas, shown in Figure 10.2, composed of $N \simeq 10^{23}$ particles, occupying a volume $V$ and at temperature $T$? There are simply too many particles to keep track of. To provide a mechanical description of the gas, we would need to know the exact position and velocity of each and every particle; such a specification is called a microstate of the system. A description of the microstate of any large object, containing about Avogadro's number of atoms, is in practice too difficult. Even more importantly, there is no need for it, since the questions asked in thermodynamics do not refer to any single atom composing the gas, but rather to the properties of the gas taken as a whole, called the bulk properties of the gas.

We now make a major conceptual leap. We postulate that having a gas at a temperature $T$ means that the gas is not in a definite (mechanical) microstate. Instead, all the various (mechanical) microstates of the gas are now taken to occur with a certain probability. Hence, the description of the gas by its microstates, that is, by the detailed knowledge of the position and momentum of each and every atom of the gas, is replaced by an ensemble of microstates. An ensemble is a collection of all the possible microstates of the gas. The ensemble is called microcanonical, canonical or grand canonical depending on the way that probabilities are assigned for the occurrence of the various microstates. We will return to this question in some detail in Section [*]. Since we know nothing of the microstates of the gas, the most consistent manner of assigning a probability of occurrence to the various allowed microstates is to assume that all the microstates of the gas are equally likely; this is how a microcanonical ensemble is defined. Our ignorance of the microstates is consequently given complete expression in the microcanonical ensemble, which is defined as follows: given the parameters such as energy, volume and so on that specify the macroscopic properties of the gas, in the microcanonical ensemble all the microstates of the gas are equally likely.

One should note that the idea of an ensemble reflects our ignorance as to what the microstate of the gas is. The gas is not inherently in a probabilistic state; rather, it is our inability to determine its state which has led us to the ideas of probability, and to the idea of classical uncertainty. In quantum mechanics we will encounter uncertainty which is not a function of our ignorance, but rather is an intrinsic property of nature. In the language of probability theory, the positions $x_n$ and velocities $v_n$ are all considered to be continuous random variables. In other words, the velocity of a particle has no definite value; rather, its probability of occurrence is determined by the ensemble that describes it. We will denote by brackets the average value of a random variable. Hence, the average value of the kinetic energy of the $i$th particle is denoted by $<\frac{1}{2}m v_i^2>$. In Section 4 we will examine more closely how to calculate the average value of various physical quantities, including the kinetic energy.
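To make the idea of an ensemble average concrete, the following minimal sketch treats the velocity components of a large number of particles as random variables drawn from an assumed, purely illustrative distribution, and estimates $<\frac{1}{2}m{\bf v}^2>$ as a sample mean; the point is only that bulk quantities are averages over the ensemble rather than properties of any single particle.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

N = 100_000        # far fewer than 10**23, but enough to illustrate averaging
m = 6.6e-26        # particle mass in kg (argon-like, assumed for illustration)
sigma_v = 250.0    # assumed spread (m/s) of each velocity component

# One microstate: a concrete assignment of a velocity to every particle.
# Each component is drawn from a symmetric distribution about zero.
v = rng.normal(0.0, sigma_v, size=(N, 3))

# Bulk quantities are ensemble averages, not tied to any single particle.
kinetic = 0.5 * m * np.sum(v ** 2, axis=1)
print(np.mean(kinetic))              # sample estimate of <(1/2) m v^2>
print(1.5 * m * sigma_v ** 2)        # exact value for this assumed distribution
\end{verbatim}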

Kinetic Theory of Gases

As the first application of the idea of ensembles, we discuss the kinetic theory of gases. This theory is applicable to a dilute gas that, to a good degree of accuracy, can be considered to be an ideal gas. Recall that every atom of the gas is considered to be a free classical particle with mass $m$ and a random velocity ${\bf v}_i$. We will use the terms atom and particle interchangeably.

Pressure

From the atomic point of view, how does pressure arise? Even before going into the detailed mechanism, we expect that pressure should be a macroscopic manifestation of microscopic motion. Consider a frictionless piston of area $A$ that confines a gas in a volume $V$ containing a total of $N$ atoms. The piston is in equilibrium with the gas at some temperature $T$. Outside the piston is a perfect vacuum.

Figure 10.3: Gas Inside the Piston and Vacuum Outside
The atoms of the gas are constantly bombarding the piston, and this causes a force to be exerted on the piston; to keep the piston in place (stationary) we need to counter this force. Consider an atom traveling straight towards the piston with velocity ${\bf v}=(v_x,v_y,v_z)\equiv (v,u,w)$; we will show later that the case of an atom moving with arbitrary velocity gives the same result. The particle hits the piston and bounces back. We assume that the collision is elastic, in that the energy of the atom is the same before and after bouncing off the piston. The assumption of the collision being elastic means that the atom does not lose any of its energy to the piston. This is reasonable, since if the atom lost energy to the piston, the piston would heat up; once the piston has reached equilibrium with the gas there is no further net transfer of energy, and so the assumption of elastic collisions is correct. As shown in Figure 10.4, for a wall placed along the $y$-axis, an elastic collision leads to a velocity ${\bf v'}=(-v_x,v_y,v_z)\equiv (-v,u,w)$. Hence, for simplicity we consider only the special case of ${\bf v}=(v,0,0)$. Let the velocity of the atom after the collision be ${\bf v'}=(v',0,0)$. Since the atom possesses only kinetic energy, we have from energy conservation
\begin{displaymath} \frac{1}{2}mv^2=\frac{1}{2}mv'^2 \end{displaymath} (10.3)
\begin{displaymath} \Rightarrow v'=-v \end{displaymath} (10.4)


Figure 10.4: Diagram v to -v
In other words, in bouncing off the piston the particle's velocity changes from $v$ to $-v$, and hence the momentum imparted to the piston is
\begin{displaymath} \mbox{\rm {Momentum imparted to piston}}=mv-m(-v)=2mv \end{displaymath} (10.5)

In time $t$, how many atoms will bounce off the piston? All the atoms with velocity $v$ can reach the piston in time $t$ if they are at a distance less than $x=vt$ from the piston. Consequently, all the atoms in a volume of size $xA=vtA$ will bounce off the piston. Hence, since the density (particles per unit volume) is $\displaystyle n=\frac{N}{V}$, in time $t$ the momentum imparted to the piston is
\begin{displaymath} \mbox{\rm {Momentum imparted to piston in time }}t=2mvxA\times n \end{displaymath} (10.6)
\begin{displaymath} =2\frac{Nmv^2A}{V}t \end{displaymath} (10.7)

Since the force on the piston is just the rate at which momentum is delivered to the piston by the collisions of the gas atoms, we have
\begin{displaymath} \mbox{\rm {Force}}=\frac{\mbox{\rm {Momentum imparted to piston in time }}t}{t} \end{displaymath} (10.8)
\begin{displaymath} \Rightarrow F=2\frac{Nmv^2A}{V} \end{displaymath} (10.9)

Recall pressure $P$ is defined to be force per unit area, and hence the pressure on the piston due to the gas is

\begin{displaymath} \mbox{\rm {Pressure}}=\mbox{\rm {Force per unit area}} \end{displaymath} (10.10)
\begin{displaymath} =\frac{F}{A} \end{displaymath} (10.11)
\begin{displaymath} \Rightarrow P=2\frac{Nmv^2}{V} \end{displaymath} (10.12)

From the ensemble point of view, the velocity $v$ of an atom is a random variable, and what the piston really experiences is the average value over all possible velocities that the atoms have as they bounce off the piston. Hence, denoting as usual average values by $<>$, we have
\begin{displaymath} PV=N<mv^2> \end{displaymath} (10.13)

The reason we have dropped the factor of $2$ in going from (10.12) to (10.13) is that only the particles heading towards the piston, and not those moving away from it, contribute to the momentum transfer. At any instant only half of the particles are moving towards the piston, and this factor of one half cancels the factor of $2$ when the average of $v^2$ is taken over all the particles. Recall we had considered a very special set of velocities, namely, those heading straight for the piston, and hence with ${\bf v}=(v,0,0)$. In general, the velocity of an arbitrary atom has the form ${\bf v}=(v,u,w)$. Since all directions for the gas are equivalent, we have

\begin{displaymath} <v^2>=<u^2>=<w^2> \end{displaymath} (10.14)
\begin{displaymath} \Rightarrow <{\bf v}^2>=<v^2>+<u^2>+<w^2>=3<v^2> \end{displaymath} (10.15)

We finally have, from eqs. (10.13) and (10.15), the following
\begin{displaymath} PV=\frac{1}{3}N<m{\bf v}^2> \end{displaymath} (10.16)

The total energy $U$ of the gas is composed solely of kinetic energy. Hence we have
\begin{displaymath} U= N\frac{1}{2}m<{\bf v}^2> \end{displaymath} (10.17)

From (10.13), (10.15) and (10.17) we have
\begin{displaymath} PV=\frac{2}{3}U \end{displaymath} (10.18)
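The momentum-transfer argument leading to (10.13)-(10.16) can be checked numerically. The following minimal sketch, with all parameter values assumed purely for illustration, places particles with random velocities in a box, adds up the momentum $2mv_x$ delivered to a wall of area $A$ in a short time $t$ by the particles that actually reach it, and compares the resulting pressure with (10.16); the isotropy relation (10.15) is checked along the way.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

# Assumed, illustrative parameters.
N = 500_000       # simulated particles (tiny compared to 10**23)
m = 6.6e-26       # particle mass in kg, argon-like
sigma = 250.0     # spread (m/s) of each velocity component
L = 1e-3          # box length in metres; the wall of interest sits at x = L
A = 1e-6          # wall area in m^2
V = L * A         # box volume
t = 2e-7          # observation time, short enough that v*t << L

# A microstate: a position along x and a velocity for every particle.
x = rng.uniform(0.0, L, size=N)
v = rng.normal(0.0, sigma, size=(N, 3))
vx = v[:, 0]

# Particles moving towards the wall that reach it within time t
# each deliver momentum 2 m v_x, as in eq. (10.5).
hits = (vx > 0) & (x + vx * t > L)
P_sim = np.sum(2.0 * m * vx[hits]) / (A * t)       # force per unit area

# Eq. (10.16): P = (1/3) (N/V) <m v^2>, averaged over all particles.
P_formula = (N / V) * m * np.mean(np.sum(v ** 2, axis=1)) / 3.0
print(P_sim / P_formula)     # ~1, up to a few percent of statistical noise

# Eq. (10.15): <v_x^2> is one third of <v^2>.
print(np.mean(vx ** 2) / np.mean(np.sum(v ** 2, axis=1)))   # ~1/3
\end{verbatim}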

Temperature

We have so far been able to define both the pressure $P$ and the energy $U$ of the gas from the atomic point of view. We now need to define temperature. Consider two gases in a cylinder separated by a frictionless piston. When the piston reaches equilibrium there is no further change in the system. We hence conclude that the temperature of the two gases must be the same, since the very definition of temperature is that there will be heat flow, and consequently no equilibrium, unless and until the temperatures of the two gases become equal.

Figure 10.6: Piston Separating out Two Gases
We now examine the conditions under which there will be equilibrium. Let us label the gas on the left of the cylinder as $1$ and that on the right as $2$. For equilibrium, the pressure exerted by both the gases on the piston must be equal. Hence, from (10.16) we have
\begin{displaymath} \frac{N_1}{3V_1}<m_1{\bf v_1}^2>=P= \frac{N_2}{3V_2}<m_2{\bf v_2}^2> \end{displaymath} (10.19)

Are the densities of the two gases the same on the two sides of the piston, that is, is $\displaystyle \frac{N_1}{V_1}$ equal to $\displaystyle \frac{N_2}{V_2}$? The answer is yes, although to prove this is quite difficult. The intuitive argument that the two densities are equal is that if there were a difference in the densities, there would be a net ``osmotic'' pressure on the piston forcing it to move, and consequently the system would not be in equilibrium. Hence, in equilibrium, we have
\begin{displaymath} \frac{N_1}{V_1}=\frac{N_2}{V_2} \end{displaymath} (10.20)

and from (10.19) we obtain
\begin{displaymath} <\frac{1}{2}m_1{\bf v_1}^2>=<\frac{1}{2}m_2{\bf v_2}^2> \end{displaymath} (10.21)

We see that the equation above is simply a statement that the average kinetic energy of the atoms in two gases which are in equilibrium is the same. Hence temperature is defined to be proportional to the average kinetic energy of a single atom of the gas. Fixing the constant of proportionality in terms of the Boltzmann constant, we finally arrive at the following definition of temperature $T$.
\begin{displaymath} \frac{3}{2}k_BT\equiv <\frac{1}{2}m{\bf v}^2> \end{displaymath} (10.22)

Temperature is a measure of how fast, on average, the atoms of a gas are moving. At room temperature $k_BT \simeq \frac{1}{40}$ eV. The faster the atoms move, the higher the temperature. The sensation of burning that we have on putting our hands into, say, a fire is because fast-moving atoms from the fire impart large amounts of kinetic energy to our hands, causing the atoms in our hands to move very fast and resulting in the sensation of burning. Hence, from eqns. (10.2) and (10.22) we have
\begin{displaymath} U=<E_{GAS}>=\frac{1}{2}m\sum_{n=1}^{N}<{\bf v}_n^2> \end{displaymath} (10.23)
\begin{displaymath} =\frac{3}{2}Nk_BT \end{displaymath} (10.24)

Combining our results, from (10.16) and (10.22) we finally obtain the ideal gas law
\begin{displaymath} PV=Nk_BT \end{displaymath} (10.25)
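As a quick numerical sanity check, assuming standard values for the constants, the ideal gas law (10.25) gives roughly atmospheric pressure for one mole of gas in 22.4 litres at 273 K, and the temperature definition (10.22) gives an rms speed of about 500 m/s for a nitrogen molecule at room temperature:
\begin{verbatim}
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23    # Avogadro's number

# Pressure of one mole in 22.4 litres at 273.15 K, from PV = N k_B T (10.25).
P = N_A * k_B * 273.15 / 22.4e-3
print(P)               # ~1.0e5 Pa, about one atmosphere

# rms speed of a nitrogen molecule at 300 K,
# from (3/2) k_B T = <(1/2) m v^2>, eq. (10.22).
m_N2 = 28.0 * 1.66054e-27      # approximate mass of an N2 molecule in kg
v_rms = math.sqrt(3.0 * k_B * 300.0 / m_N2)
print(v_rms)           # ~520 m/s
\end{verbatim}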

The result above has the remarkable implication that no matter what the gas is composed of, for example, be it nitrogen, helium and so on, equal volumes of the various gases at the same pressure and temperature contain the same number $N$ of atoms. Note that this result follows directly from Newton's laws, as is seen from the derivation given. This remarkable property of the ideal gas led Avogadro to postulate that one molar volume of any gas contains the same number of atoms, given by Avogadro's number $N_{\mathrm{Avogadro}}$.

http://srikant.org/core/node11.html

Héctor A. Chacón C.
