10.5 Overall levels of scientific understanding
Hargreaves, I., Lewis, J. and Speers, T. ‘Towards a better map: Science, the public and the media’, Economic and Social Research Council.
Both surveys asked the same 13 questions in order to establish levels of public understanding of science or science policy. The questions were designed to test a basic (rather than advanced) knowledge of the science or research, with particular emphasis on exploring the policy and political developments in relation to scientific issues. It is worth noting, at this point, that we are not suggesting that these questions are all necessarily things people need to know to be able to make useful or worthwhile responses to these issues. Some facts, in this respect, are clearly more pivotal than others. We shall return to this in our Conclusion: at this point, we want simply to record what people knew, what they didn't, and, crucially, how these things might be responses to media coverage.
All the questions in the index were multiple choice, offering between two and five possible answers. The average number of correct responses remained remarkably constant – and fairly low – across the two surveys, decreasing marginally from 5.0 correct responses in April to 4.9 correct responses in October (around 38 per cent in both surveys). On the basis of this, it is clear that overall levels of public understanding are fairly low, especially when one considers that the average score is not much higher than it would have been – given the multiple choice format – had people been guessing arbitrarily throughout. We should note, however, that the shape of responses does suggest a degree of understanding, albeit limited, in some areas.
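To make the comparison with chance concrete, the baseline can be sketched as follows. The report states only that each of the 13 questions offered between two and five options, so the particular mix of option counts below is a hypothetical assumption chosen to span that range, not the actual question set.

```python
# Sketch: expected number of correct answers from purely arbitrary guessing
# on a 13-question multiple-choice index. The real per-question option
# counts are not given in the report; this mix of 2-, 3-, 4- and 5-option
# questions is an illustrative assumption only.
option_counts = [2, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 5]

# A random guess on a k-option question is correct with probability 1/k,
# so the expected total score is the sum of 1/k over all questions.
expected_by_chance = sum(1 / k for k in option_counts)

print(round(expected_by_chance, 2))  # roughly four correct answers by chance alone
```

Against a chance baseline of roughly four under this assumed mix, the observed averages of 4.9–5.0 amount to only about one extra correct answer, which is the sense in which the scores are "not much higher" than arbitrary guessing.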
Perhaps more surprising are the scores amongst those with more science education, or more education generally. While those with more education did better, the difference between those with more and less education was not as great as might have been expected.
So, for example, while those with no science education at all scored between 4.3 and 4.5, those with science degrees only scored just above 50 per cent (between 6.8 and 6.9).
In terms of age, those aged 45–54 scored highest, with the under-25s and over-65s scoring lowest. There are some interesting exceptions to this, which we will look at later in relation to climate change.
Although television is the most popular source of information about science related stories (59 per cent say it is their main source of such information1), the amount of television people watch has little effect on knowledge. Heavier television users tend to score lower, although this group also tends to be less educated. The frequency with which people watch television news also has little discernible effect, with the exception of people who watch little or no news, who do tend to score lower.
Perhaps not surprisingly, broadsheet readers did better than tabloid readers, although again, the differences are not that great. Most broadsheet readers averaged scores between five and six, and tabloid readers between four and five (although readers of the Daily Star were at the bottom in both surveys, averaging around 3.5 correct answers).
Given the fairly low levels of knowledge overall, it is not surprising that when we asked respondents (in the October survey) if there were occasions when they ‘felt confused about scientific issues’, most – 79 per cent – said yes. Interestingly, those who admitted feeling confused knew slightly more (4.9 to 4.7) than the 21 per cent who claimed they did not!
When it came to self-assessment in relation to three issues, people were a little more self-aware. In each case, those who felt well informed about the issues did better than those who felt ‘partly informed’, with those saying they felt ‘not very well’ informed scoring lowest.
Once again, however, the differences are not great, with all of the groups claiming to be well informed scoring an average of less than 50 per cent. This highlights the problems of testing knowledge purely through self-assessment (an increasingly common practice), revealing as it does the gap between claiming to be informed and actually being informed. While self-assessment might be useful in measuring people's self-confidence, such measures do not, on their own, tell us what or how much people know.
If most people are sometimes confused about scientific issues, who do they blame? Chiefly, according to half the people in our survey, the complexity of the scientific issues themselves. Yet how difficult is most of the basic science that makes it into the news? While science can, like any discipline, be very difficult, most of the questions in our public understanding index were at a fairly basic level. So, for example, the ‘greenhouse effect’, in which greenhouse gases accumulate to prevent heat from escaping from the Earth's atmosphere, is a fairly simple concept to understand. As an idea, it is not much more complex than understanding, say, the offside rule in football (let alone the difference between, say, a 4-4-2 and a 3-5-1-1 formation). And yet most people were simply unaware of the workings of the greenhouse effect (only 16 per cent in the April survey and 17 per cent in the October survey answered this correctly), assuming, instead, that greenhouse gases thinned the ozone layer (a response we shall look at in more detail when we look at the public understanding of climate change).
The second most nominated source of confusion was the media, and it is to the role of the media that we now turn.
1 This is followed by 22 per cent who nominate newspapers as their main source; seven per cent say radio, five per cent the internet, four per cent magazines, two per cent books and two per cent their friends. Newspapers top the list of second favoured sources (45 per cent), followed by television (25 per cent), radio (11 per cent), friends (five per cent), the internet (five per cent), magazines (four per cent) and books (three per cent).