Test kits for water analysis

2.3 Photometric tests

The colour of a solution, and hence the amount of visible light it absorbs, is related to its concentration. This relationship is expressed by the Beer-Lambert law (often shortened to Beer's law):

A = εcl  (3)

where c is the concentration of the solute, l is the pathlength of the cell, usually one centimetre, and ε is the molar absorptivity (or molar extinction coefficient), a measure of how strongly a chemical species absorbs light at a particular wavelength and in a particular solvent.
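Because the law is linear, it can be rearranged to find the unknown concentration from a measured absorbance, c = A/(εl). A minimal sketch of both directions, using an invented molar absorptivity value purely for illustration:

```python
# Illustrative sketch of the Beer-Lambert law, A = epsilon * c * l.
# The molar absorptivity used below is hypothetical, not a course value.

def absorbance(epsilon, c, l=1.0):
    """Absorbance for molar absorptivity epsilon (l mol-1 cm-1),
    concentration c (mol l-1) and pathlength l (cm, usually 1)."""
    return epsilon * c * l

def concentration(A, epsilon, l=1.0):
    """Rearranged for the unknown: c = A / (epsilon * l)."""
    return A / (epsilon * l)

# A species with epsilon = 1.2e4 l mol-1 cm-1 at 5.0e-5 mol l-1
# in a 1 cm cell gives an absorbance of about 0.6.
A = absorbance(1.2e4, 5.0e-5)
c = concentration(A, 1.2e4)
```

Note that ε depends on both the wavelength and the solvent, so a value measured under one set of conditions cannot be reused under another.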

In the laboratory the absorbance of a coloured solution is measured using a UV-visible spectrometer or simpler fixed wavelength colourimeter. Video 3 demonstrates a spectrometer in use.

Video 3 UV-visible spectrophotometry.

In the field, however, the need for convenience and portability means less complex devices called photometers are used. These miniaturised spectrometers tend to use light-emitting diodes (LEDs) as an excitation source. LEDs have many advantages: they emit narrow-band radiation compared with a tungsten light source, and because they are more efficient at producing visible light, their energy consumption for a given light intensity is much lower. This makes them ideal for a battery-powered field instrument. Photometers also contain interference filters for wavelength selection, and light intensity is usually detected by a photodiode. A microprocessor converts the measurement into an instant concentration readout in mg l−1.

Portable photometers are pre-programmed with test methods and the sample cell is often barcoded so the correct method is selected. Calibration is carried out by the manufacturers who also supply traceable reference standards for on-site checking. Photometers also correct for turbidity (discussed in detail in Section 4.1).

Photometers are available that measure multiple analytes or just one. The sample is contained within an optical-grade glass cell, and a typical procedure involves adding measured doses of reagent to the sample using a dose-metered dropper or a pipette, or by dissolving tablets. The solution is left for a prescribed period for the colour to develop, and is then inserted directly into the photometer to obtain a reading - standard practice for colourimetric measurement.

There is a linear relationship between the colour of the solution (absorbance) and the concentration of analyte but this isn't necessarily the case over the whole concentration range you might wish to measure (Figure 5). A 'flattening out' is common at high concentration, and you must make sure your unknown value falls within the linear range on the curve. Modern photometers have this range electronically stored.
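The range check described above can be sketched in code: fit a straight line through calibration standards, then refuse to report any reading that falls beyond the highest absorbance still on the linear part of the curve. The standard concentrations, absorbances and cut-off below are invented for illustration only.

```python
# Hypothetical calibration sketch: a straight line through the origin
# fitted to invented standards, with an out-of-range check.

# Calibration standards: (concentration in mg/l, measured absorbance)
standards = [(0.0, 0.00), (1.0, 0.20), (2.0, 0.40), (4.0, 0.80)]

# Least-squares slope for a line forced through the origin:
# m = sum(c*A) / sum(c*c)
slope = (sum(c * a for c, a in standards)
         / sum(c * c for c, a in standards))

LINEAR_MAX = 0.80  # highest absorbance still on the linear region

def read_concentration(absorbance):
    """Convert a reading to mg/l, or None if dilution is needed."""
    if absorbance > LINEAR_MAX:
        return None  # beyond the linear range: dilute and re-measure
    return absorbance / slope
```

A commercial photometer stores exactly this kind of range limit electronically; the sketch simply makes the decision rule explicit.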

If the concentration of a particular analyte is outside the linear range of the test method, dilution is necessary, and, as we discussed earlier, your measured concentration value must be multiplied by the appropriate dilution factor.
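The dilution arithmetic is straightforward: the dilution factor is the final volume divided by the sample volume taken, and the measured result is multiplied back up by that factor. The volumes and reading below are invented for illustration.

```python
# Sketch of correcting for dilution; all values are hypothetical.
sample_volume = 10.0   # ml of original sample taken
final_volume = 100.0   # ml after making up to the mark with diluent

dilution_factor = final_volume / sample_volume  # here, a 10x dilution

measured = 0.85                                  # mg/l from diluted sample
true_concentration = measured * dilution_factor  # mg/l in original sample
```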

This is where test strips come into their own. They can be used as a preliminary test, i.e. an indication of concentration of your analyte to tell you if it is within the working range of your instrument or in the non-linear region.

Figure 5  The working range of a photometer.

Question 7

We have shown the graph passing through the origin in Figure 5. In reality such a graph becomes non-linear at low concentration values. Why is this?


As concentration is progressively lowered you will reach a point known as the limit of quantification, which is the lowest concentration of analyte that can be determined with an acceptable level of uncertainty; this defines the lower limit of the linear range. This is to be distinguished from the limit of detection which is the lowest concentration of analyte that can be detected but not necessarily quantified.