We have argued on an abstract basis that the state of highest entropy (and hence the most probable state) for any complicated system is the one whose macroscopic properties can be obtained in the largest possible number of different ways; if the model systems we have considered are any indication, a good rule of thumb for how to do this is to let each ``degree of freedom'' of the system contain (on average) an equal fraction of the total energy U. We can justify this argument by treating that degree of freedom as a ``system'' in its own right (almost anything can be a ``system'') and applying Boltzmann's logic to show that the probability of that microsystem having an energy ε while in thermal equilibrium at temperature T decays exponentially as e^{-ε/k_B T}, where k_B is Boltzmann's constant. This implies a mean ε on the order of k_B T, if we don't quibble over factors comparable to 1.
The Equipartition Theorem, which is more rigorously valid than the above hand-waving would suggest,15.25 specifies the factor to be exactly 1/2:
A system in thermal equilibrium with a heat reservoir at temperature T will have a mean energy of (1/2) k_B T per degree of freedom.
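As a quick numerical sanity check (not part of the original notes), the following Python sketch averages the energy (1/2) m v_x^2 of a single quadratic ``degree of freedom'' over the Boltzmann factor e^{-ε/k_B T} and recovers the factor of exactly 1/2. The mass, temperature and velocity grid are arbitrary illustrative choices.

```python
import numpy as np

# Numerical check of the Equipartition Theorem for one quadratic degree of
# freedom (the x-motion of an atom of mass m), weighted by the Boltzmann
# factor exp(-eps / k_B T).  All specific numbers here are illustrative.

k_B = 1.380649e-23     # Boltzmann's constant, J/K
T   = 300.0            # temperature, K
m   = 6.64e-27         # mass of a helium atom, kg (an arbitrary choice)

v_th = np.sqrt(k_B * T / m)                 # thermal velocity scale
v = np.linspace(-8 * v_th, 8 * v_th, 20001) # grid wide enough to cover the distribution
eps = 0.5 * m * v**2                        # energy of this degree of freedom
w = np.exp(-eps / (k_B * T))                # Boltzmann weight

mean_eps = np.sum(eps * w) / np.sum(w)      # Boltzmann-weighted average energy
print(mean_eps / (k_B * T))                 # -> 0.5, i.e. <eps> = (1/2) k_B T
```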
In an ideal monatomic gas of N atoms at temperature T, each atom has three degrees of freedom: left-right (x), back-forth (y) and up-down (z).
Thus the average internal energy of our monatomic ideal gas is

U = (3/2) N k_B T .     (15.19)

It also means that if we change the temperature of a container of gas, the internal energy U changes with it; the rate of change of U with temperature, which is the definition of the HEAT CAPACITY, is

C ≡ dU/dT = (3/2) N k_B .     (15.20)
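To put some numbers into Eqs. (15.19) and (15.20), here is a short Python sketch for a sample gas; the choice of one mole of helium at 300 K is purely illustrative.

```python
# Internal energy and heat capacity of a monatomic ideal gas,
# from Eqs. (15.19) and (15.20).

k_B = 1.380649e-23      # Boltzmann's constant, J/K
N_A = 6.02214076e23     # Avogadro's number (atoms per mole)

N = N_A                 # one mole of atoms (illustrative choice)
T = 300.0               # temperature, K

U = 1.5 * N * k_B * T   # Eq. (15.19): U = (3/2) N k_B T
C = 1.5 * N * k_B       # Eq. (15.20): C = dU/dT = (3/2) N k_B

print(f"U = {U:.0f} J")     # about 3.7 kJ for this mole of gas
print(f"C = {C:.2f} J/K")   # about 12.5 J/K, i.e. (3/2)R per mole
```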
Now let's examine our gas from a more microscopic, ``mechanical'' point of view: picture one atom bouncing around inside a cubical container which is a length L on a side. In the ``ideal'' approximation, atoms never hit each other, but only bounce off the walls, so our consideration of a single atom should be independent of whether or not there are other atoms in there with it. Suppose the atom in question has a velocity with components v_x, v_y and v_z along the three axes of the cube.
Thinking only of the wall at the +x end of the box, our atom will bounce off this wall at a rate 1/t, where t is the time taken to travel a distance 2L (to the far wall and back again) at a speed v_x:

t = 2L/v_x .

We assume perfectly elastic collisions -- i.e. the magnitude of v_x does not change when the particle bounces, it just changes sign.
Each time our atom bounces off the wall in question, it imparts an impulse of 2 m v_x to that wall. The average impulse per unit time (force) exerted on said wall by said atom is thus F_1 = 2 m v_x / t, or

F_1 = m v_x^2 / L .
This force is (on average) spread out all over the wall, an area A = L^2, so that the force per unit area (or pressure) due to that one particle is given by p_1 = F_1/A = m v_x^2 / L^3. Since L^3 = V, the volume of the container, we can write p_1 = m v_x^2 / V, or

p_1 V = m v_x^2 .     (15.21)
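The single-particle result p_1 V = m v_x^2 can be checked with a crude Python simulation of one atom bouncing back and forth along x, tallying the impulses it delivers to the +x wall. The mass, box size, speed and time step below are arbitrary illustrative values, not from the notes.

```python
# Check of Eq. (15.21), p_1 V = m v_x^2, by simulating one atom bouncing
# along x inside a cubical box of edge L.

m   = 6.64e-27          # atom mass, kg (arbitrary)
L   = 1.0e-3            # box edge, m
v_x = 500.0             # x-speed, m/s

dt = 1.0e-9             # time step, s
t_total = 1.0e-3        # total simulated time, s

x, v = 0.0, v_x
impulse = 0.0           # total impulse delivered to the +x wall
t = 0.0
while t < t_total:
    x += v * dt
    if x >= L:          # bounce off the +x wall: reverse v, record impulse 2 m v_x
        x = 2 * L - x
        v = -v
        impulse += 2 * m * v_x
    elif x <= 0.0:      # bounce off the -x wall (no impulse tallied for that wall)
        x = -x
        v = -v
    t += dt

F_avg = impulse / t_total          # average force on the +x wall
p_sim = F_avg / L**2               # pressure from the simulation
p_formula = m * v_x**2 / L**3      # Eq. (15.21): p_1 = m v_x^2 / V
print(p_sim, p_formula)            # the two agree to within a bounce or so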
Now, the kinetic energy of our original atom is explicitly given by

ε = (1/2) m v^2 = (1/2) m (v_x^2 + v_y^2 + v_z^2) ,     (15.22)

and the Equipartition Theorem tells us that each of the three terms has an average value of (1/2) k_B T, so that in particular

⟨ m v_x^2 ⟩ = k_B T .     (15.23)
Combining Eqs. (15.21) and (15.23), and adding up the (on average equal) contributions of all N atoms to the pressure, we obtain the famous IDEAL GAS LAW:
p V = N k_B T .     (15.24)
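As a final arithmetic check (again, not from the original notes), plugging one mole of gas at 0 °C in 22.4 litres into Eq. (15.24) should give about one atmosphere:

```python
# Pressure of one mole of ideal gas at 0 C in 22.4 litres, from Eq. (15.24).

k_B = 1.380649e-23      # Boltzmann's constant, J/K
N_A = 6.02214076e23     # Avogadro's number

N = N_A                 # one mole of atoms
T = 273.15              # temperature, K
V = 22.4e-3             # volume, m^3 (22.4 litres)

p = N * k_B * T / V     # Eq. (15.24): p = N k_B T / V
print(f"p = {p/1000:.1f} kPa")   # about 101 kPa, i.e. roughly 1 atm
```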
Despite the flimsiness of the foregoing arguments, the IDEAL GAS LAW is a quantum mechanically correct description of the interrelationship between the pressure p, the volume V and the temperature T of an ideal gas of N particles, as long as the only way to store energy in the gas is in the form of the kinetic energy of individual particles (usually atoms or molecules). Real gases can also store some energy in the form of rotation or vibration of larger molecules made of several atoms, or in the form of potential energies of interaction (attraction or repulsion) between the particles themselves. It is the latter interaction that causes gases to spontaneously condense, below a certain boiling point T_b, into liquids and, at a still lower temperature T_m (called the melting point), into solids. However, in the gaseous phase even carbon [vaporized diamond] will behave very much like an ideal gas at sufficiently high temperature and low pressure. It is a pretty good Law!