Entropy and the relentless drift from order to chaos

It is all too evident in everyday life: a broken egg cannot be put together again

In a famous lecture in 1959, scientist and author C P Snow spoke of a gulf of comprehension between science and the humanities, which had split into “two cultures”.

Many people in each group had little appreciation of the concerns of the other, causing grave misunderstandings. Snow compared ignorance of the second law of thermodynamics to ignorance of Shakespeare.

So, what is this mysterious second law? Put simply, it says that a physical system moves towards, or remains in, the most probable state. In fact, this is not really mysterious at all. Toss 10 coins in the air. Typically, about half will come down heads and half tails. Were all 10 to land heads up, suspicion would be aroused: the chance of this is about one in a thousand.
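
As a quick check on that figure: each fair coin independently lands heads with probability 1/2, so the chance of 10 heads is (1/2) to the power 10. A minimal sketch in Python:

    # Probability that 10 independent fair coins all land heads
    p_all_heads = 0.5 ** 10
    print(p_all_heads)   # 0.0009765625, i.e. about 1 in 1,024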

The key factor is the “multiplicity”, the number of ways an event can happen. There is only one way to get 10 heads: every single coin must be a head. But five heads can be any five of the 10 coins, and there are about 250 ways this can happen. In the jargon of statistical mechanics, we call each of these ways a microstate, and there are about 250 microstates in the macrostate of five heads out of 10. Thus, five heads coming up is about 250 times more likely than 10 heads.
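
The counts here are binomial coefficients, easy to verify with a few lines of Python (the exact value for five heads out of 10 is 252, which rounds to the 250 quoted above):

    import math

    # Number of microstates (distinct arrangements) giving k heads from 10 coins
    for k in (0, 5, 10):
        print(k, "heads:", math.comb(10, k), "microstates")
    # 0 heads: 1, 5 heads: 252, 10 heads: 1
    # So five heads is 252 times more likely than 10 heads.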

Breathtakingly large

The situation becomes simpler as the numbers become larger. With 100 coins, the chance of all being heads is so minute that we can take it as zero. For a volume of gas, the numbers are breathtakingly large. A cubic metre box of air contains about 24 trillion trillion molecules, distributed uniformly throughout it. The chance that they would all move to the left side of the box, leaving a vacuum on the right, is quite beyond remote: it never happens!
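
To put a number on “beyond remote”, suppose each molecule independently has a 1-in-2 chance of being in the left half of the box (a crude assumption, but good enough for an order-of-magnitude estimate). The chance that all of them sit on the left is then (1/2) to the power N, a number best grasped through its logarithm:

    import math

    N = 2.4e25                    # molecules in a cubic metre of air (approx.)
    log10_p = -N * math.log10(2)  # log10 of the probability (1/2)**N
    print(log10_p)                # about -7.2e24

The probability is roughly 10 to the power of minus 7.2 trillion trillion: a decimal point followed by trillions of trillions of zeros before the first non-zero digit.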

We learn at school that the logarithm of the product of two numbers is the sum of the logarithms of the numbers: logarithms turn multiplication into addition. Logs are fundamental in mathematics and physics. In maths, the distribution of prime numbers follows a logarithmic law. Many physical phenomena are also governed by log laws, most notably the second law of thermodynamics.
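
That product-to-sum property is easy to verify numerically; a two-line check in Python:

    import math

    a, b = 7.0, 11.0
    print(math.log(a * b))             # 4.3438...
    print(math.log(a) + math.log(b))   # the same: log(ab) = log(a) + log(b)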

The Austrian physicist Ludwig Boltzmann struggled to explain the behaviour of gases using the ideas of statistical mechanics. For any observed macrostate – a specific temperature and pressure – there is a truly enormous number of microstates; the multiplicity W is vast.

Boltzmann knew that, for two combined systems, the total energy is the sum of the two component energies. He wanted a quantity for macrostates that had this additive property. So, in 1877 he took the logarithm of the multiplicity, defining the entropy as S = k log W (k is called Boltzmann’s constant).
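It is worth making the additivity concrete: when two systems are combined, every microstate of one can pair with every microstate of the other, so the multiplicities multiply and, thanks to the logarithm, the entropies add. A small sketch with toy multiplicities (real ones are astronomically larger):

    import math

    k = 1.380649e-23   # Boltzmann's constant, in joules per kelvin

    def entropy(W):
        """Boltzmann entropy S = k log W."""
        return k * math.log(W)

    W1, W2 = 1e10, 1e12                 # illustrative multiplicities
    print(entropy(W1 * W2))             # combined system
    print(entropy(W1) + entropy(W2))    # the same number: entropy is additive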

Molecular disorder

Entropy is a measure of molecular disorder. It governs the direction of irreversible physical processes that, left to their own devices, tend from order to chaos.

The awful truth is that, in an isolated system, entropy never decreases. This is what the second law of thermodynamics states. It is all too evident in everyday life: a broken egg cannot be put together again. We understand this without mathematics, but Boltzmann’s equation makes it sharp and inescapable.
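
The drift towards the most probable macrostate can be watched in a toy model (an illustration of the idea, not taken from the column): start with 100 coins all heads, the most ordered state, and repeatedly flip one coin chosen at random. The entropy log W, where W is the multiplicity of the current head count, climbs towards its maximum near 50 heads and then hovers there:

    import math
    import random

    coins = [1] * 100                         # fully ordered: all heads
    for step in range(1, 2001):
        coins[random.randrange(100)] ^= 1     # flip one coin at random
        if step % 500 == 0:
            n = sum(coins)
            S = math.log(math.comb(100, n))   # entropy in units of k
            print(step, n, round(S, 1))
    # The head count drifts towards 50, where log W peaks at about 66.8.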

Boltzmann knew that matter is made of myriad atoms. This was disputed by many authorities and he encountered severe criticism. It all became too much and in 1906, aged just 62, he hanged himself. His equation S = k log W is carved on his tombstone.

Peter Lynch is emeritus professor at the School of Mathematics & Statistics, University College Dublin – he blogs at thatsmaths.com