Why is a second a second long? Let me explain – backwards, if I may

If we could not measure then we could not build cities, trade goods, or carry out scientific experiments

Engineer Hans Hilfiker designed the second hand of the Swiss rail clock to complete its revolution in 58.5 seconds, then stand still for 1.5 seconds more to, as he put it, “bring calm in the last moment and ease punctual train departure”. Photograph: Fabrice Coffrini/AFP/Getty Images

Of all the disciplines that enable human flourishing, measurement is perhaps the most overlooked. It’s a practice that can be traced back to the world’s oldest civilisations and has since become interwoven with everyday life. If we could not measure then we could not build cities, trade goods, or carry out scientific experiments. Yet we often take for granted those units of length and weight that make up this global language. We don’t often stop to ask ourselves: why is a meter a meter? Why is an inch an inch?

For each unit, though, there is a fascinating history that explains not only why seemingly arbitrary lengths and weights have been retained over the millennia, but why measurement itself is so important to our world.

Meter

No unit reveals the political importance of measurement more than the meter. It was first defined during the French Revolution, when the country’s intellectual elite, the savants, set out to create a system of measurement that would embody revolutionary ideals of universality and rationality.

France in the 18th century was burdened by a confusion of measures that encouraged exploitation and stymied trade. Defining the value of units was a prerogative of the nobility, which led to a profusion of weights and measures across the country’s provinces. It allowed local lords to take advantage of their subjects – weighing payments of grain in larger units than those used by peasants at the market, for example – prompting a common revolutionary demand for “one law, one weight, and one measure”.

The metric system was invented to overturn this and the other inequalities of the Ancien Régime, with new units derived from the latest scientific knowledge, and the meter itself defined as one ten-millionth of the distance from the North Pole to the equator.

This definition required a seven-year survey of France to calculate the length of the meter, and a much longer campaign to encourage citizens to use the new unit. The work was not an immediate success, either, and France would not only become the first country to adopt the metric system, but the first to reject it, too. It would take decades of political accommodation for the meter and other metric units to be accepted in Europe, but as Napoleon Bonaparte presciently declared after the savants’ work was complete: “Conquests will come and go but this work will endure.”

Napoleon Bonaparte crossing the Alps by Jacques-Louis David. Napoleon said of the metric system: “Conquests will come and go but this work will endure.” Photograph: The Art Archive/Musée du Chateau de Versailles/Dagli Orti

Inch

If the meter shows how units of measurement can be designed from scratch to solve specific problems, the inch demonstrates the importance of historical lineage.

The inch is one of the oldest units in continual use in the English-speaking world, with the first written definitions dating back to the Middle Ages. Around 1150, King David I of Scotland defined the unit as the width of the thumb “mesouret at the rut of the nayll”. To compensate for the natural variety of the human body, though, this was given the addendum that the length should be taken as the average of the “thowmys of iii men, that is to say a mekill [large] man, and a man of messurabel statur, and of a lytell man”.

Two centuries later, King Edward II of England offered a new interpretation, declaring that “three grains of barley, dry and round, make an inch.” (Interestingly, the barleycorn was for many years its own unit of length, and survives today as the gradation of shoe sizes in English-speaking countries.)

Deriving units from the body and from nature in this way is the oldest form of measurement. These include units such as the ancient Egyptian cubit (the length from the elbow to the tip of the middle finger), the Roman passus (a pace of around 1.5 meters), and the Arabic qirat (equal to the mass of a seed from the carob tree, from which we derive the modern unit of the 200-milligram carat, used to weigh precious gems). Such units are useful because they are always accessible, more or less consistent, and appropriate in scale. But over time, demands for precision and uniformity outweigh ease-of-use, and “natural” units must either be standardised or replaced.

Kilogram

The kilogram was created alongside the meter and so shares its political origins. In fact, it was derived from the meter and defined in 1795 as the mass of a single litre of water (the litre itself was the volume of a cube with sides 10 centimetres in length). It was an appropriately straightforward definition and meant that anyone with a meter bar in hand could define the kilogram and check its weight. In theory, anyway.
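For readers who like to check the sums, here is that chain of definitions written out as a few lines of Python. It is only a sketch of the idealised 1795 arithmetic, assuming the round-number density of one gram per cubic centimetre the savants were aiming at:

```python
# The 1795 chain of definitions, run as arithmetic.
# Assumes the idealised density of 1 g/cm3 for water -- the real
# figure varies, which is exactly the problem described below.

side_cm = 10.0                    # a cube 10 cm on each side...
volume_cm3 = side_cm ** 3         # ...holds 1,000 cm3: one litre
mass_g = volume_cm3 * 1.0         # at 1 g/cm3, that litre weighs 1,000 g

print(f"volume: {volume_cm3:.0f} cm^3 (one litre)")
print(f"mass:   {mass_g / 1000:.0f} kg")   # 1,000 g = 1 kilogram
```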

The savants soon ran into problems, though. Try to replicate this experiment yourself and you’ll discover that a whole range of factors affect the mass of a litre of water, from salinity to temperature to altitude. The definition was found to be so frustratingly approximate, in fact, that it was jettisoned, and a lump of metal forged to define the kilogram instead. (This happened twice: first in 1799 and again in 1889.) In other words, although it was more ideologically satisfying to have a definition of the kilogram anyone could replicate, it was more practical to define it by fiat.

The story doesn’t end there, though, for even the most stable metal is liable to change, and in the 20th century scientists discovered that the kilogram was losing weight. In 2018, scientists voted to redefine the unit once more, basing its definition not on any physical matter, but on immaterial constants of nature (in this case, the Planck constant, realised through delicate measurements of electromagnetic forces). Every unit of the metric system is now defined in this way, based on phenomena such as the spin of atoms and the speed of light. Such calculations can only be replicated in a few labs around the world, but they ensure that the units we use are unchanging – until we want to redefine them again.

Degree Celsius

Units of weight and length are intuitively understood, but some phenomena, such as temperature, are so subjective they seem to resist accurate measurement. For most of human history, temperature has been measured only approximately. Writing in the second century AD, the ancient Greek physician Galen was one of the first thinkers to suggest there might be degrees of hot and cold, but he thought just four gradations would cover the necessary variations. By the 1500s, natural philosophers such as Galileo had designed early thermometers – glass tubes filled with liquid that rose and fell in response to changes in air pressure caused by heat – but such devices were still extremely imprecise.

By the 17th century, thermometers had improved but thermometry still faced challenges. For example, how do you check that your thermometer is reliable if you don’t have a reliable thermometer to compare it to in the first place? The solution was to seek out stable thermometric phenomena – events that always occurred at the same temperature and could be used to check a thermometer’s accuracy. A number of suggestions were put forth, from the melting point of butter to the heat of blood, freshly drawn. But after much experimentation, two reliable candidates emerged: the freezing and boiling points of water.

It was the Swedish astronomer Anders Celsius who applied this discovery most fruitfully in the 18th century, placing the two watery markers at either end of his thermometer’s scale and dividing the range between them by 100. As testament to the often arbitrary practices of measurement, though, Celsius’s original thermometer was actually backwards: he thought that water should freeze at 100°C and boil at 0°C.

Second

Why is a second a second long? Well, one oblique and fundamentally unverifiable explanation is: because you have 12 finger bones in each hand. Let me explain – backwards, if I may.

The second is currently defined as “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom.” To simplify a little, this means that scientists fire a laser at an atom of caesium-133, which flips back and forth between two energy states, giving off tiny electromagnetic pulses as it does, like the ticking of the world’s tiniest clock. We count these pulses – all 9 billion-plus of them – and that constitutes a second.
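To get a feel for how small those ticks are, a couple of lines of Python will do the division (the number of periods is exact by definition; everything else here is just arithmetic):

```python
# The SI second, as a counting exercise. The count of periods
# is fixed exactly by the definition quoted above.

CAESIUM_PERIODS_PER_SECOND = 9_192_631_770

one_period_s = 1 / CAESIUM_PERIODS_PER_SECOND
print(f"one tick lasts about {one_period_s:.3e} seconds")
# -> roughly 1.088e-10 s; count 9,192,631,770 of them
#    and exactly one second has passed
```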

If 9 billion-plus seems like an arbitrary number, it’s only because we want our current definition to match the previous one, which measured the second as 1⁄31,556,925.9747 of the year 1900 (you have to choose a specific year because variations in the Earth’s orbit around the sun mean years differ slightly in length). That definition itself had been used to replace another fraction: 1/86,400 of a single day. And that, in turn, was used because each of the day’s 24 hours had been divided into 60 minutes and each minute into 60 seconds.
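Those fractions check out with a little arithmetic – sketched here in Python purely to verify the numbers in the paragraph above:

```python
# The older definitions of the second, run as arithmetic.

# 24 hours x 60 minutes x 60 seconds gives the familiar fraction of a day:
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)             # 86400 -- hence 1/86,400 of a day

# And the 1900-based definition implies a year of this many days:
seconds_per_year_1900 = 31_556_925.9747
print(seconds_per_year_1900 / seconds_per_day)   # ~365.2422 days
```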

These base units of time get their names from those divisions: the minute from pars minuta prima, Latin for “the first very small part”, and the second from pars minuta secunda, or “the second very small part”. And why 60? Because the ancient Babylonians, who made some of the world’s earliest and most accurate astronomical observations, used a sexagesimal (base 60) numeral system, instead of the decimal one (base 10) we’re used to today.

There’s one final question, though, and this is where we get very conjectural: why are there 24 hours in a day? Well, this is because the ancient Babylonians divided daylight into 12 parts, which were later doubled to cover the night, making a total of 24 hours. They did this because… we don’t really know why. One theory is that they were copying the 12 lunar cycles of the year; another is that the number was derived from their mathematical practice of finger-counting, in which you count by touching your thumb to the twelve bones of the four fingers on the same hand (which, multiplied by the five digits of the other hand, makes 60).

So, there we have it. Run that backwards and you go from 12 finger bones to the length of the second. And it only took a minute.

James Vincent is the author of Beyond Measure: The Hidden History of Measurement, which was published by Faber on June 2nd.