# 1.4 Decimal places

If you have less than one unit you should put a zero before the decimal point to make the value easier for yourself and others to read (e.g. you should write 0.4 rather than just .4, as will be explained later in this course). However, how many zeros should you put *after* the final digit of a decimal? For instance, is 0.4 the same as 0.40?

The short answer is that on one level, it is. However, by writing 0.40 we are saying that there are four tenths and zero hundredths, and importantly we are saying that we can actually measure to an accuracy of an individual hundredth of a unit; in other words to two **decimal places**. In contrast, by writing 0.4 we are only claiming an accuracy to the level of individual tenths of a unit, or to one decimal place.
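The distinction between the value and the number of decimal places it is reported to can be illustrated with a short sketch (in Python, used here purely as a worked example; the `:.1f` and `:.2f` format codes are Python's standard way of fixing the number of decimal places):

```python
# As pure numbers, 0.4 and 0.40 are identical.
assert 0.4 == 0.40

# As *reported* measurements, they claim different precision.
one_dp = f"{0.4:.1f}"   # one decimal place: accurate to tenths
two_dp = f"{0.4:.2f}"   # two decimal places: accurate to hundredths
print(one_dp)  # 0.4
print(two_dp)  # 0.40
```

The printed strings differ even though the underlying number is the same: the trailing zero in `0.40` is a statement about how finely we claim to have measured.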

One way of getting a more accurate measurement is to use an instrument with a more finely divided scale.

Figure 3 shows a close-up of two thermometers, labelled A and B, that were placed side by side to record the air temperature in a room.

In terms of accuracy, the scale on thermometer A is quite coarse, as the markings represent individual degrees Celsius (°C). Using this scale, we can see that the room temperature is somewhere between 21 °C and 22 °C. On closer inspection, someone might estimate it at 21.7 °C, but someone else could easily record it as 21.6 °C or 21.8 °C. There is some uncertainty in the first decimal place, and there is certainly no way we could accurately state the temperature to two decimal places using this thermometer.

In order to give someone an idea of how confident we are about the measurements we make with thermometer A we should quote the **range** of possible values (i.e. the highest and lowest values) within which the actual temperature could lie. Since we estimate the reading to be between 21.6 °C and 21.8 °C we would choose the mid-point of these values, and say that the temperature was within 0.1 °C of 21.7 °C. As we will see later, another way to write this would be 21.7 °C 'plus or minus' 0.1 °C, or 21.7 ± 0.1 °C. This gives us a measure of the **uncertainty** of thermometer A.
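The arithmetic here is simple enough to sketch: the best estimate is the mid-point of the range, and the uncertainty is half the width of the range. A minimal example (in Python, with the thermometer A readings from above):

```python
# Estimated range of readings from thermometer A, in degrees Celsius
low, high = 21.6, 21.8

midpoint = (low + high) / 2       # best estimate: mid-point of the range
uncertainty = (high - low) / 2    # half the width of the range

# Quote the measurement as "value ± uncertainty", to one decimal place
print(f"{midpoint:.1f} ± {uncertainty:.1f} °C")  # 21.7 ± 0.1 °C
```

The same recipe works for any measurement range: quote the mid-point, and attach half the range as the plus-or-minus term.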

Now look at thermometer B. This thermometer has a finer scale, with divisions marked every 0.1 °C. Now we can clearly see that the room temperature is between 21.6 °C and 21.7 °C. This is within the range of possible values that we estimated from thermometer A, but the finer scale of thermometer B allows us to be more certain of the temperature. Nevertheless, even though thermometer B allows us to read the temperature to within 0.1 °C, we cannot be so sure about the second decimal place; someone might read it as 21.63 °C, whilst another person might read it as 21.61 °C or 21.65 °C. With this scale, we can be sure of the first decimal place, but not the second.