## Aggregates of Atoms

The objects around us in our everyday life range in size from
millimeters to hundreds of meters. But molecules have sizes of less
than a billionth of a meter. Things that are 'our size' therefore
contain unimaginably large numbers of atoms (of the order of a
million billion billion, or 10^{24}). How can we
understand the behavior of these enormous atom aggregates? Recall
that molecules interact - attract and repel each other - even when
they do not form chemical bonds. And we also saw how 'molecules'
can extend indefinitely in two or three dimensions - a topic taken
up in greater detail in later sections. But are there fundamental
guiding principles which allow us to understand how aggregates of
atoms and molecules behave?

One of the most profound and important principles which guide
our understanding of matter in aggregation is provided by the
science of thermodynamics, which developed in the nineteenth century
in response to the engineering and scientific challenges posed by
the developing technology of heat engines - in particular the
question of the factors which limited their efficiency. And it is
one of the most fascinating features of the history of science that
this problem in applied science provoked the formulation of one of
the most philosophical, profound fields of science. Indeed,
considerable mystique has tended to surround the celebrated
*'Second Law'* of thermodynamics. However, the basic ideas in
thermodynamic theory are simple.

The 'First Law' has a variety of formulations, but they all
correspond to the principle of conservation of energy - the
statement that energy is never created or destroyed. Energy is, of
course, a key scientific concept which developed along with
thermodynamics. It is manifested in numerous forms including motion
(kinetic energy), position (potential energy), heat, and chemical and
electrical energy (as discussed in further detail in the Appendix)
which can be interconverted. The usefulness of the concept of
energy arises from the principle of conservation which is simply a
statement to the effect that with regard to energy, nature operates
a "fixed exchange rate" policy: a given amount of electrical energy
will, for example, always be converted into the same amount of heat
energy and *vice versa.*
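
The "fixed exchange rate" can be made concrete with a little arithmetic. The sketch below (the kettle's wattage and heating time are illustrative figures, not from the text) tracks a quantity of electrical energy as it reappears, joule for joule, as heat in a kilogram of water:

```python
# First-law bookkeeping: electrical energy dissipated in a heater
# reappears, joule for joule, as heat in the water it warms.
c_water = 4184.0   # specific heat of water, J per kg per kelvin

power_W = 2000.0   # a 2 kW electric kettle (illustrative figure)
time_s = 60.0      # run it for one minute
mass_kg = 1.0      # heating one kilogram of water

E_electrical = power_W * time_s           # 120,000 J drawn from the mains
dT = E_electrical / (mass_kg * c_water)   # the same 120,000 J, now as heat

print(round(dT, 1))  # 28.7 (kelvin of temperature rise: energy in equals heat out)
```

No energy is created or destroyed in the exchange; the only freedom nature allows is the form the energy takes.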

The second law is far more subtle and, in its most general and
useful formulations, is one of the most far-reaching of scientific
concepts. Of the many different ways of stating the law, perhaps
the simplest and most widely used formulation is as follows:
*'Systems, when considered in their totality, evolve towards a
state of increasing entropy.'* Entropy is a precisely definable
and measurable quantity which can be interpreted qualitatively in
terms of the degree of disorder in the system - a concept that can
be underwritten mathematically as we will see shortly. All the
other statements of the second law with which the reader may be
familiar can be derived from or are equivalent to this formulation.
For example, the well known (and correct) statement that "entropy
is time's arrow" merely emphasises that by "evolve" we mean evolve
in time.

Thermodynamics is a subject concerned with "macroscopic observables" but it is linked to the microscopic world of atoms and molecules by one of the most important equations of physics:

S = k log_{e}(W)

where *S* is the entropy of a system, *k* is a
constant (known as the Boltzmann constant after the original
formulator of the equation, Ludwig Boltzmann), *W* is the
"probability" of the system in its mathematical sense, that is the
number of distinct ways of arranging the atoms or molecules that are
consistent with the overall properties of the system. Systems that
are 'disordered' at the atomic level have higher probabilities and
hence higher entropies. Boltzmann's equation therefore gives
precision to the interpretation of entropy in terms of disorder; it
also unites the macroscopic science of thermodynamics with the
microscopic subject of *statistical mechanics* that explores
the statistical behavior of matter at the microscopic level.
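
Boltzmann's equation can be tried out on a toy model. The sketch below (the two-halves-of-a-box model is an illustrative assumption, not taken from the text) counts the ways W of realising a macrostate and converts the count into an entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k ln W for a system with W equally likely microstates."""
    return k_B * math.log(W)

# Toy model: N gas molecules, each free to sit in either half of a box.
# The ordered macrostate "all molecules on one side" can be realised in
# only W = 1 way; the mixed macrostate "half on each side" in
# W = C(N, N/2) ways.
N = 100
W_ordered = 1
W_mixed = math.comb(N, N // 2)  # an enormous number, about 1e29

print(boltzmann_entropy(W_ordered))  # 0.0: a unique arrangement has zero entropy
print(boltzmann_entropy(W_mixed))    # positive: more arrangements, higher entropy
```

The disordered macrostate wins simply because it can be realised in vastly more ways, which is exactly the sense in which higher "probability" means higher entropy.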

The second law gives a verifiable and exact expression of one of the most basic features of nature - the drive towards states of increasing disorder. But ordered structures are common in nature. Indeed, we will describe the complex and ordered atomic structures present in crystals; and the science of crystallography is about ordered structures in three dimensions. Living matter is remarkably organized and its high level of organization is essential for its function. If the relationship between order and entropy means anything, then both crystals and living matter must be low entropy systems. How then do they survive in a universe which is constantly evolving to a state of higher entropy?

The key to this problem which has caused much confusion can be
found in our formulation of the second law which we recall referred
to systems considered *"in their totality"*. Crystals and
living matter are normally only part of a total system; they are
surrounded by an environment with which they exchange heat (and
possibly matter). The relationship is cleverly formalised in
thermodynamics where we focus on the component (the subsystem) of
the total system in which we are interested (say our crystal) and
consider the rest as a "heat bath" which can supply or withdraw
heat and which is characterised by its *temperature.* Next we
need to realize that ordered states commonly have low energies - a
point to which we return. And on passing from an ordered to a
disordered state, our 'subsystem' must absorb energy from its
surroundings. But when we withdraw energy from the surroundings,
they become less disordered; that is, they lose entropy. So what
happens depends on the balance between the entropy change in the
subsystem and that in its surroundings. Again, the formulation of
thermodynamics theory allows us to deal with this complex problem
in a straightforward way. The loss of disorder, *i.e.* of
entropy, on withdrawing energy from the surroundings decreases with
the amount of energy in these surroundings; and the greater the
amount of energy in a body, the higher its temperature. So the
higher the temperature, the lower the entropy loss. Thermodynamics
expresses this intuitively obvious relationship by a precise
mathematical relationship: it says that the change in entropy (for
which we will use the symbol ΔS) of any system or subsystem (like
our "thermal bath") is related directly to the heat gain or loss
(represented by the symbol Q) but scaled by the inverse of the
temperature:

ΔS = Q / T

which is another of the handful of key equations in science. And
temperature in the context of thermodynamics is *defined* so
that this equation is true. Moreover, a consequence of this
thermodynamic definition of temperature is the concept of "absolute
zero" of temperature - a state in which the system has no energy
and at which classical physics would lead us to expect completely
motionless arrangements of atoms. However, one of the many bizarre
consequences of quantum mechanics is the failure of this classical
concept. Even in their lowest possible energy state, atoms must
move; they have *zero point* motion. If they did not, we would know
exactly where they were, contradicting the uncertainty principle.
'Absolute zero' is a hypothetical state that may never be achieved,
although low temperature physicists have come to within a few
millionths of a degree of this ideal 'energy-less' state.
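
The equation ΔS = Q/T is enough to show why heat spontaneously flows from hot to cold. The sketch below (the two temperatures and the quantity of heat are illustrative figures) applies it to each body in turn and then, as the second law requires, considers the totality:

```python
def entropy_change(Q: float, T: float) -> float:
    """Entropy change dS = Q / T for heat Q (joules) exchanged at temperature T (kelvin)."""
    return Q / T

# 1000 J of heat flows from a hot body at 400 K to a cold body at 300 K.
Q = 1000.0
dS_hot = entropy_change(-Q, 400.0)   # hot body loses heat: -2.5 J/K
dS_cold = entropy_change(Q, 300.0)   # cold body gains heat: about +3.33 J/K
dS_total = dS_hot + dS_cold

print(dS_total > 0)  # True: the totality gains entropy, so the flow is allowed
```

The same Q costs less entropy at a high temperature than it earns at a low one, which is precisely the "inverse of the temperature" scaling at work.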

But to return to order and disorder, at low temperatures systems
tend to adopt an ordered state. As noted above, ordered states have
low energies. To become disordered, a system has to withdraw
energy from its surroundings, which reduces *their* disorder.
As the temperature increases, this loss of disorder from the
surroundings becomes less and less, and our system will
increasingly pull in energy to achieve greater disorder for itself.
One of the most interesting and dramatic of such changes is the
process of melting. Crystalline solids are ordered arrangements of
atoms; liquids are characterised by disorder at the atomic level.
The process of melting - the conversion of the ordered crystalline
state into the disordered liquid - invariably requires an input of
energy (known as the latent heat of melting) to create the
higher-energy disordered state. The crystal melts at the
temperature at which the increase in entropy associated with
melting outweighs the loss of entropy of the surroundings
associated with the system withdrawing the latent heat.
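
The entropy balance at the melting point can be checked numerically. The sketch below uses textbook figures for water (a latent heat of fusion of about 6010 J/mol and a melting point of 273.15 K, supplied here as assumptions) and treats the entropy of fusion as independent of temperature:

```python
L_fusion = 6010.0   # latent heat of melting of ice, joules per mole (textbook value)
T_melt = 273.15     # melting point of ice, kelvin

# Entropy gained by the crystal on melting, roughly L / T_melt per mole.
dS_melting = L_fusion / T_melt   # about 22 J/(mol K)

for T in (263.0, 273.15, 283.0):
    dS_surroundings = -L_fusion / T          # surroundings supply the latent heat
    dS_total = dS_melting + dS_surroundings  # this balance decides the outcome
    print(T, round(dS_total, 3))
# Below 273.15 K the total is negative and ice stays frozen;
# above it the total is positive and melting wins.
```

At exactly the melting point the two contributions cancel: the entropy gained by the disordering crystal precisely pays for the entropy lost by the surroundings.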

The development of thermodynamic theory and the associated
subject of statistical mechanics represent another of the great
intellectual achievements of modern science. They allow us to
understand the fundamental factors controlling the behavior *in
aggregate* of the unbelievably large numbers of atoms and molecules
present in the objects around us; they bridge the microscopic and
macroscopic worlds. And they allow us to understand the balance
between order and disorder in the universe. The most ordered of
objects are crystals.