What is temperature and how does it relate to humidity?
December 8, 2015
I’ve worked at TVC for a few years now and it never ceases to amaze me what our equipment is used for. Our ALXII unit is capable of measuring everything from volts and temperature to gas and wire speed. I’ve always been interested in science and technology, and since I’m studying for a design degree, it seemed like the perfect opportunity to extend my working knowledge and understanding of these key features…plus my director challenged me to do it after I got a rather good grade on my last assignment! This series of blogs is going to look at key figures in the history of these discoveries, and at why and how we use them today.
This particular piece was going to be on temperature, and I was looking at the Celsius and Fahrenheit scales; however, after beginning my research, up popped Lord Kelvin. It’s also hard to discuss temperature without referencing humidity! Therefore, this piece is going to cover both subjects and some of the key figures who explored their properties.
Heat can be transferred by conduction, convection or radiation, and temperature is a numerical representation of the potential for that transfer. If one of two objects is hotter than the other, heat will flow from the hotter to the colder; it is the difference in temperature that drives the flow. Once the flow stops and the temperatures level off, the objects have reached thermal equilibrium, and it is this state that allows us to measure temperature. Heat and temperature are therefore related but not the same. Heat energy comes from atomic and molecular motion: the more energetic the particles, the more thermal energy an object holds. In a gas or liquid the particles move around freely, whereas in a solid they vibrate in place. We measure thermal energy (heat) in joules (J), and the second law of thermodynamics allowed us to define an absolute temperature scale. This matters because an absolute scale is independent of any particular substance: a mercury thermometer can be calibrated against it, but the scale itself does not depend on the properties of mercury.
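As a simple illustration of heat flowing until thermal equilibrium, here is a minimal Python sketch. The function names and example values are my own, and it assumes two masses of water with no heat lost to the surroundings:

```python
# Sketch: heat flows from hot to cold until thermal equilibrium.
# For two masses of the same substance (here, water), the final
# temperature is the mass-weighted average. Illustrative values only.

SPECIFIC_HEAT_WATER = 4186  # J/(kg*K), specific heat capacity of water

def equilibrium_temp(m1, t1, m2, t2):
    """Final temperature (deg C) when two water masses are mixed,
    assuming no heat is lost to the surroundings."""
    return (m1 * t1 + m2 * t2) / (m1 + m2)

def heat_transferred(mass, t_start, t_end):
    """Thermal energy change in joules: Q = m * c * dT."""
    return mass * SPECIFIC_HEAT_WATER * (t_end - t_start)

# Equal parts hot (100 C) and cold (0 C) water level off at 50 C
t_final = equilibrium_temp(1.0, 100.0, 1.0, 0.0)
print(t_final)  # 50.0

# Joules given up by the hot kilogram on the way to equilibrium
print(heat_transferred(1.0, 100.0, t_final))  # -209300.0
```

With equal masses the result is simply the average of the two temperatures, which is exactly the ‘neutral temperature’ idea that appears later in this piece.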
In our modern world, we invariably use electronic devices rather than mercury-filled ones to measure temperature and, interestingly, recent experiments using ultra-cold atoms have even produced temperatures which are negative on the absolute temperature scale.
Humidity, meanwhile, is the presence of water vapour in a gas; most people are familiar with it in air, where the higher the humidity, the more water vapour there is. If air at a given temperature contains the maximum amount of water vapour it can hold at that temperature, it is said to be saturated; usually, the amount present is less than that. Humidity can be difficult to measure, as water vapour is everywhere! Measurements can be made with a variety of techniques and recorded in many different ways. Condensing water vapour gives us clouds, rain, dew, frost and fog, although the vapour can affect materials even without condensing. How water behaves on and around surfaces depends on temperature, which is why relative humidity is such a useful quantity: the properties of water vapour depend on the temperature and pressure of its surroundings, and the warmer the air (gas), the more water vapour it can hold.
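The idea that warmer air can hold more water vapour can be sketched with the Magnus approximation for saturation vapour pressure. This is a standard empirical formula, not anything specific to the instruments discussed here; the particular coefficients below are one common parameterisation and the results are approximate:

```python
import math

# Magnus approximation for saturation vapour pressure over water.
# Coefficients (17.625, 243.04) are one common parameterisation;
# results are approximate and for illustration only.

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure in hPa at temp_c (deg C)."""
    return 6.1094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def relative_humidity(actual_vp, temp_c):
    """Relative humidity (%RH): actual vapour pressure as a
    percentage of the saturation value at this temperature."""
    return 100.0 * actual_vp / saturation_vapour_pressure(temp_c)

# Warmer air can hold more water vapour before saturating:
print(saturation_vapour_pressure(10.0))  # roughly 12 hPa
print(saturation_vapour_pressure(30.0))  # roughly 42 hPa
```

Note how the same actual vapour pressure therefore gives a lower %RH in warmer air, which is exactly why humidity readings only make sense alongside a temperature reading.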
Humidity measurement is integral to modern society: weather reports, preventing the spoilage of perishable goods, air conditioning systems. It is also integral to our climate, as water vapour helps to warm the earth; without it we would be a fair few degrees colder!
Overall, our discovery of temperature and humidity began slowly, much delayed by religious doctrine in Europe, and only picked up pace from the Renaissance and the ‘Age of Enlightenment’ onwards. During the Iron Age we began to create compounds and work metals, which required a degree of temperature control, and there are records of the Indian culture documenting clouds around 3000 BC and of the Chinese weighing charcoal exposed to the atmosphere to detect its increase in weight. The Greeks were the first known to record rainfall, around 500 BC, and in 170 AD a Greek physician by the name of Galen recorded his findings on mixing boiling water and ice. He wrote of a ‘neutral temperature’ which he had created with equal parts of the hot and cold water, but it wasn’t until the 1400s that we began to systematically test and understand our environment.
In 1450, the Italian artist and great thinker Leon Battista Alberti described a flat-plate device to measure wind speed, an anemometer, and in around 1460 the German theologian and astronomer Nicolaus de Cusa invented the first hygrometer: he placed wool on top of stone, and as the wool absorbed moisture from the air, he noted his observations.
By 1593, scientists, Galileo Galilei among them, were creating thermoscopes. These could show that one temperature differed from another, but not what the temperature actually was. Around 1640, the great René Descartes, who developed analytical geometry, established that water vapour was a distinct substance within air, and in 1643 Evangelista Torricelli invented the barometer. A numerical scale was later added to the thermoscope by Santorio Santorio, whose version was used to take the temperature of humans, but it wasn’t until 1654 that an enclosed, liquid-filled thermometer was created by Ferdinand II, using alcohol. Incidentally, Ferdinand II had also created a condensation hygrometer earlier that decade! Still, these instruments were not accurate, and invention continued. The Danish astronomer Olaus Roemer worked on his own alcohol-filled thermometer, marking the boiling and melting points of water on a scale of his own creation.
Enter Daniel Gabriel Fahrenheit. After meeting Roemer, Fahrenheit decided to refine the thermometer the astronomer had created, and in 1714 he gave us the modern thermometer. He used mercury instead of alcohol, giving much more accurate measurements as it expanded and contracted with the temperature. The switch to mercury was made possible by a new method of cleaning it, which allowed the liquid to rise and fall without sticking to the sides of the glass; mercury also expands at a more constant rate than alcohol and can be used across a much broader temperature range. During his investigations, Fahrenheit also discovered that the boiling point of water changes with atmospheric pressure, and how to supercool water (cooling it below its freezing point without it turning to ice). In 1724 he announced his scale: using a mixture of water, ice and ammonium chloride (a salt), he set the zero point, and the freezing and boiling points of water were marked on the scale at 32 and 212 degrees respectively.
Fahrenheit also created a constant-weight hydrometer, which measures the specific gravity and strength of a liquid, and a thermobarometer, which estimated atmospheric (barometric) pressure from the boiling point of water.
Soon after Fahrenheit announced his scale, another astronomer, Anders Celsius of Sweden, announced a temperature scale which divided the interval between the freezing and boiling points of water into 100 degrees; the scale is often referred to as centigrade because of this 100-degree division. Celsius’ original scale actually had zero as the boiling point of water, but the scale was soon reversed.
It wasn’t until 1799 that humidity, as we understand it, was first measured accurately by John Leslie, using a dry- and wet-bulb differential thermometer although the formula for calculating humidity by this method is credited to Joseph Louis Gay-Lussac in 1815.
In 1848, Britain’s William Thomson announced his temperature scale. As he later became known as Lord Kelvin, the scale was given the name of the Kelvin scale. It is based on absolute zero, the theoretical temperature at which a material’s thermal energy is at its minimum: 0 K is the coldest point on the scale and there are no negative temperatures. Since the 1950s it has been the preferred scale for scientific applications worldwide, although everyday use in the USA has stayed with Fahrenheit.
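The three scales discussed above are related by exact formulas: K = °C + 273.15 and °F = °C × 9/5 + 32. A quick Python sketch of the conversions (function names are my own):

```python
# Conversions between the Celsius, Fahrenheit and Kelvin scales.
# These relations are exact by definition.

def celsius_to_fahrenheit(c):
    """deg C -> deg F: F = C * 9/5 + 32."""
    return c * 9.0 / 5.0 + 32.0

def celsius_to_kelvin(c):
    """deg C -> K: K = C + 273.15 (0 K is absolute zero)."""
    return c + 273.15

def fahrenheit_to_celsius(f):
    """deg F -> deg C, inverting the first relation."""
    return (f - 32.0) * 5.0 / 9.0

print(celsius_to_fahrenheit(100.0))   # 212.0  (boiling point of water)
print(celsius_to_kelvin(0.0))         # 273.15 (freezing point of water)
print(fahrenheit_to_celsius(-40.0))   # -40.0  (the two scales cross here)
```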
The dew-point meter for humidity came in 1854, created by H.V. Regnault, and in 1887 the first aspirated psychrometer was invented by R. Assmann; this used the suction of a fan to ventilate a wet- and dry-bulb device. There are a variety of ways to express humidity, but there is no official SI unit. The most commonly used is relative humidity, a percentage with the unit ‘%RH’: it records the degree of saturation of the water vapour in the air compared with the maximum possible at that particular temperature. We can also measure the dew point, the temperature at which dew (condensation) would form if the air (gas) were cooled. This is useful because gases can then be stored at temperatures where we know no condensation will form, and because it is an absolute measurement, unlike relative humidity. Humidity can also be expressed as a fraction or ratio. Whichever expression is chosen, it is important to record the air temperature at the same time: the ‘relative’ in relative humidity refers to temperature, and the same amount of water vapour gives a different %RH at a different temperature.
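The relationship between %RH, temperature and dew point can be sketched by inverting the Magnus approximation for saturation vapour pressure. This is a generic approximation, not the method used by any particular instrument mentioned here, and the coefficients are one common parameterisation:

```python
import math

# Estimating the dew point from air temperature and relative humidity
# by inverting the Magnus approximation. Results are approximate.

A, B = 17.625, 243.04  # Magnus coefficients (valid over everyday deg C range)

def dew_point(temp_c, rh_percent):
    """Approximate dew point (deg C) for a given temperature and %RH."""
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# At 20 C and 60 %RH, condensation would form on surfaces around 12 C
print(dew_point(20.0, 60.0))

# At 100 %RH the air is saturated, so the dew point equals the air
# temperature and dew forms on any cooler surface
print(dew_point(20.0, 100.0))
```

This also shows why dew point is an absolute measurement: cooling a gas does not change its dew point until condensation actually occurs, whereas its %RH changes with every degree of temperature.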