Eight years on from when Tim Harford first became the presenter of BBC statistics programme More or Less, the Open University's Professor Kevin McConway revisits their first interview and looks back over how the series has changed during his tenure as academic consultant for the show.
- Find out more about More or Less on Radio 4
- Investigate free courses and articles about maths and statistics
- Learn how to read and interpret social data with our Postcode Patterns interactive
1) What's changed on More or Less over the years? Are there fewer statistical mistakes in the news these days?
Transcript: "I hadn't noticed that, but you're probably right. And if that's true, why is it true?"
KEVIN MCCONWAY: Thanks for talking to me Tim Harford. It’s great to have been working with you on More or Less ever since you started. I think that was eight years ago now. And I’d like to start off by perhaps reviewing what’s changed in the world of numbers since you began. One thing that occurred to me is I don’t think you’re picking up as many journalistic statistical mistakes as often as you did, would you agree?
TIM HARFORD: It’s interesting you say that, Kevin. I hadn’t noticed that. But you’re probably right. And if that’s true, why is it true? Why are we not picking up so many journalistic errors? I certainly don’t think it’s because journalists are making fewer errors. I think many journalists are under a lot of pressure, more pressure than they ever were before.
They’re trying to get copy out on deadline. They’re more likely to simply push out press releases without subjecting them to scrutiny. And so while the best journalism I think is probably better than ever, because journalists have access to more resources than ever, I think the average standard of journalism isn’t any better at all. So, why, assuming you’re right, why have we moved away from just taking down stories in The Daily Mail and The Sun and The Telegraph and The Guardian? And I think the answer is probably because we’ve been doing this for a long time, and we like a little bit of variety so we chase down other prey so to speak.
I think also that there are, you know, we want to get to the bottom of what’s going on. So it’s all very well to say a newspaper has published a story and it’s not true and they shouldn’t have published it. What we really want to say is, well, what is true? And so I think that’s probably where our attention has gone. We’re focusing on what can we say about, for example this week we’re looking at dementia. We can take down newspaper reporting of dementia, but what we really want to know is: well, what can we say that stands up about the problem of dementia, and how do we know it?
KEVIN MCCONWAY: So, when you’re doing that, is part of the aim of that perhaps to help people when they’re reading the papers or watching the broadcast news or something, to decide which is the really good stuff, which I’d agree with you is coming out now, and which is the trash?
TIM HARFORD: Well we want a variety in any episode of More or Less. We want some lighter stories, some more serious stories. We want quite technical stuff and very simple stuff. So I don’t think there’s a consistent aim in everything we do. We are a magazine format. But we certainly believe that a lot of the statistical takedowns that we do, you could do yourself at home without any particular statistical training. It might take a Google search or two, often not even that. Just a little bit of common sense and ask yourself some basic questions such as what’s the source of this number? Is this number big? Should I be impressed by it even if it is true? What does it really mean? I’ve been given some kind of definition but I don’t really know what the number applies to.
So we are trying to empower people. But also sometimes things are technical and you can’t do it at home. And we can talk to experts, and experts can explain things to us. And so why not convey those expert views as well?
KEVIN MCCONWAY: Yes, I think that makes a great mix. Just on the point about what people can do at home themselves, a lot of the things you said are surely the same kind of questions that people should ask themselves about any news story, even if it isn’t explicitly about numbers. Would you agree?
TIM HARFORD: Well I think you always want to be sceptical about any story, have a healthy scepticism. Who wants me to believe this, why do they want me to believe it? But I think that journalists are more sceptical about certain kinds of stories, and are better able to challenge certain kinds of stories than others.
So for example if friends of a cabinet minister whisper in some political journalist’s ear, well that journalist probably has the good sense to check around, to get corroboration, to put those claims in context. But if a charity comes out with a report saying that the condition that the charity is raising money for is getting worse, a lot of journalists are helpless or feel helpless in front of those numerical claims. And they will often just say well it’s a charity, it’s numbers, let’s publish them.
So, yes, we should be sceptical about everything, but probably the scepticism is more needed in statistical stories because journalists often feel less qualified to apply that scepticism themselves.
KEVIN MCCONWAY: That’s interesting. So maybe that means there’s an even bigger reason for empowering ordinary listeners and so on, which More or Less does, as you’ve described.
2) Has there been any change in statistics being used to inform policy?
Transcript: "I think we've taken some steps in the right direction, but they're quite small steps..."
KEVIN MCCONWAY: Just in terms of the use of numerical evidence, numerical stories and things by others, do you think we’ve moved on as a nation, as a society, in terms of evidence-based policy? I mean a lot has been said over the past few years about the importance of evidence in, you know, determining government policy, determining the policy of all sorts of bodies, but do you think anything’s really changed?
TIM HARFORD: I think we have taken some steps in the right direction, but they’re quite small steps. So, for example, there is more emphasis on the use of randomised trials to inform social policy. The Prime Minister famously set up what has been called the Nudge Unit, the Behavioural Insights Team, which has since been spun off and does consulting for various governments.
While it was often described as being a behavioural economics unit applying ideas from psychology to design public policy better, what the Nudge Unit really did that I think was interesting was run randomised trials. And it has popularised the idea of randomised trials with ministers who now say well you can run a randomised trial fairly cheaply, it doesn’t have to be this great big sophisticated thing, doesn’t have to cost a lot of money, you can get results quite quickly.
So that I think is something that is useful. And of course randomised trials are not the only way to generate evidence, but they are a good robust way to generate evidence that people understand quite straightforwardly. Oh well there were two groups, and some of them got one policy and some of them got another policy, and we can compare the two groups and it makes sense.
KEVIN MCCONWAY: Yes, it’s not always as easy as that but I think you’re right, people do understand them. So it’s good that things are moving on.
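Tim's two-groups description can be sketched in a few lines of code. This is a toy illustration only, not any real trial: the participants, the two "letter" policies, and the response rates are all invented for the example.

```python
import random
import statistics

# A minimal sketch of a two-arm randomised trial: assign participants to
# groups at random, apply a different policy to each group, compare outcomes.
# All numbers below are invented for illustration.

random.seed(42)

participants = list(range(100))
random.shuffle(participants)            # random assignment is the key step
group_a = participants[:50]             # gets policy A (e.g. the standard letter)
group_b = participants[50:]             # gets policy B (e.g. a redesigned letter)

def simulated_outcome(policy):
    """Pretend outcome: 1 if the person responded, 0 otherwise.
    The response rates are made up purely for this sketch."""
    base_rate = 0.30 if policy == "A" else 0.40
    return 1 if random.random() < base_rate else 0

results_a = [simulated_outcome("A") for _ in group_a]
results_b = [simulated_outcome("B") for _ in group_b]

# Because assignment was random, a difference in average outcomes is
# evidence that the policy itself made the difference, rather than some
# pre-existing difference between the groups.
print(f"Policy A response rate: {statistics.mean(results_a):.2f}")
print(f"Policy B response rate: {statistics.mean(results_b):.2f}")
```

As Kevin notes, real trials are rarely this easy, but the logic of the comparison really is this simple.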
3) Have attitudes changed towards statistics as a subject?
Transcript: "Do people think statistics are more interesting? I think they possibly do..."
KEVIN MCCONWAY: Do you think there’s been any influence of that [the popularisation of randomised trials] or of anything else on people’s attitude to statistics? I remember thinking back a long time to when I first became a statistician, there would be this image that it was terribly boring. You’d mention it at a party and everyone would kind of slink away. Maybe I’m just more personable than I was, but I think people’s views have changed. Do you agree, do you think people think it’s less boring than once they did?
TIM HARFORD: It is entirely possible that you have become more charming Kevin, difficult as it is to believe that you could improve on perfection. But drawing back from this sample of one, your own dinner party experience, do people think statistics are more interesting? I think they possibly do.
Certainly if I look at my own field, which is economics, empirical work is now much more popular within it. It used to be all about theoretical results, proving existence theorems. Now the action is all about gathering data, examining that data and seeing what you can establish about the real world. Probably the most successful social science book to be published in the last 20 years is basically about statistics. It’s written by an economist and a journalist, but they’re basically doing applied statistics. So I think that has helped.
What hasn’t helped though is the fact that it is now easier than ever for people to just hop on the internet and grab any old nonsense number they see, stripped of any context, without really understanding what they’re looking at, and insert that number into whatever argument they want to make: politicians do it, corporations do it, non-profits do it, people do it in arguments down the pub. And so many of those numbers are meaningless that I think the problem is not so much that statistics are boring, but that people feel that statistics can’t be relied upon, and that’s something that really worries me.
KEVIN MCCONWAY: Yes, because there’s a sense in which one might hope they’d be more reliable than some sort of description in words, something which is clearly an opinion. But this is an old position in a way, isn’t it? If you go right back to the 19th century, when the Statistical Society became the Royal Statistical Society, there was this split that they set up between the statistical facts that statisticians were responsible for, and the interpretation of those facts and their use in making policy, which economists, political economy people, were responsible for. And those two things were kind of not supposed to meet. So this discourse has been going on for a long time.
TIM HARFORD: Yes, I mean I understand the idea that statisticians should be, when they’re producing official statistics, they should stand above the fray and they should be independent minded and so on. Of course often being involved with gathering the statistics means that you have a really good sense of what those statistics are and aren’t telling us. So I’m not sure I would draw such a bright line between producing the statistics and developing policy ideas.
Something that does worry me, Kevin, is the fact that so many statistical claims are now made which I think are neither true nor false. They’re just empty claims. They’re claims that don’t mean what you think they mean, or they’re claims that are true but unhelpful or true but misleading. We saw that a lot during the general election campaign, where More or Less were among a number of groups fact checking that campaign. So often you would hear a politician make a claim, and then you’d go on The Today programme or you’d go on PM in the evening on Radio 4, and the presenter would say well Tim, is that claim true? And what you would hope to be able to say is yes that’s true, or no that’s not true.
But very often you got into a situation where you said, um, well, it’s complicated, it all depends on what you mean by blah, blah, blah. And at that point you’ve lost, and the person who came up with this unhelpful statistic has won. They’ve kind of dragged the fact checker into the mud with them, you sound as if it’s all very confusing and complicated and too boring to understand, and you haven’t been able to demonstrate that they’ve been lying. Because they haven’t quite been lying; they’ve been misleading people in other ways.
So I’m not quite sure exactly what the right term is for these statistics that are neither true nor false. I need to come up with one. I can think of a word but I’m not sure I should say it while the microphone is taping me. But that’s a worry, that’s a big worry. Because it just generates the idea that oh, you know, all of this, all of these numbers, they’re just hot air, they don’t mean anything, and it’s all too complicated to understand. And I think that’s very harmful for public policy, and it’s something that we statistically minded people should resist.
KEVIN MCCONWAY: Yes, which on More or Less you certainly do try to do. Because, I’m sure you’re right, there’s a kind of contamination from that misuse of statistics to anything numerical. You know, if they can do that with this one that even Tim Harford couldn’t make sound good, they can do it with anything. And that’s not true, but it’s a difficult idea to get across.
4) Has big data changed the world?
Transcript: "Well I think there is a lot of interesting citizen data science out there..."
KEVIN MCCONWAY: Just turning on to a different aspect of this, going back a few years with the rise of big data and big important public data sources, and those data sources being made available to the public, there was a kind of big hope that if governments, you know, perhaps commercial companies, whoever, put the data out there, then citizen data analysts would come to grips with it and find all sorts of interesting things that nobody had seen, would audit it so the government wouldn’t have to worry so much about the auditing and so on. And that has happened a bit but it hasn’t happened that much. Would you agree, and why do you think that is if you do?
TIM HARFORD: Well I think there is a lot of citizen data science out there, a lot of people getting interesting datasets and doing interesting things with them. Maybe it hasn’t changed the world. And I think that’s partly because data are not always very easy to handle. You can draw very pretty pictures and beautiful data visualisations, but without a lot of training they don’t necessarily tell you the truth about the world. Another issue of course is the anonymisation of datasets.
So, when you say open data, well, we could be talking about a lot of things, but often we’re talking about releasing data that describes people. And you might have very sensitive information about those people: where they live, where they spend their nights, whether they regularly spend their nights at houses other than the ones they should be spending their nights at. What their internet browsing habits are, whether they have any medical conditions; you have all this information potentially about people. And intuitively you say well that’s great, we just take Kevin McConway’s name off that data and then we can put his internet browsing habits out there and that’ll be no problem. But it turns out it’s very hard to effectively anonymise data. You can go back, compare that data with other publicly available data, and suddenly you find you’ve pinned it down and it is Kevin McConway after all who’s looking for that very strange stuff on the internet, and we’ve got you. And I think concerns about that sort of thing have slowed down certain aspects of open data, and perhaps rightly.
I seem to remember there was a case in the States, possibly Massachusetts. I may have misremembered the details, but broadly, the governor of Massachusetts allowed lots of medical records to be released, because this was all about open data and that’s great, and don’t worry, they were going to be anonymised. And very shortly a data researcher figured out which records were actually the governor’s, and mailed a copy of the governor’s own confidential medical record to him. It didn’t have the governor’s name on, but they’d managed to work out: well, this person is the same age as the governor, has the same number of kids as the governor, lives on the same street as the governor, it’s probably the governor.
So that I think has slowed down the open data movement. And these are serious things we need to wrestle with. But I wouldn’t write it off either. So much of what we do now on More or Less involves our friendly loyal listeners getting data, analysing the data, sending us the results and writing all kinds of interesting things. They might blog about it; they might just email it to us. So there’s lots of good stuff going on.
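The linking attack Tim describes can be sketched in a few lines. Everything here is invented for illustration, the records, names and postcodes alike, but it shows why stripping the name field alone is not enough: matching "anonymised" records against a public register on shared quasi-identifiers (age, postcode, sex) can re-identify them.

```python
# Toy "anonymised" medical records: names removed, but quasi-identifiers kept.
# All data below is invented for illustration.
anonymised_medical = [
    {"age": 54, "postcode": "CB2 1TN",  "sex": "M", "diagnosis": "asthma"},
    {"age": 61, "postcode": "OX1 4AR",  "sex": "F", "diagnosis": "diabetes"},
    {"age": 54, "postcode": "SW1A 1AA", "sex": "M", "diagnosis": "hypertension"},
]

# A public dataset that does carry names, e.g. an electoral register.
public_register = [
    {"name": "A. Governor", "age": 54, "postcode": "SW1A 1AA", "sex": "M"},
    {"name": "B. Citizen",  "age": 61, "postcode": "OX1 4AR",  "sex": "F"},
]

def reidentify(medical, register):
    """Link records by matching on the quasi-identifiers age, postcode, sex."""
    matches = []
    for m in medical:
        for r in register:
            if all(m[k] == r[k] for k in ("age", "postcode", "sex")):
                matches.append((r["name"], m["diagnosis"]))
    return matches

# Two of the three "de-identified" records match a named public entry,
# so removing names alone did not protect them.
print(reidentify(anonymised_medical, public_register))
```

This is essentially the Massachusetts attack in miniature: the sensitive field (the diagnosis) was never published with a name, but the combination of ordinary attributes was unique enough to act as one.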
KEVIN MCCONWAY: Yes, that sounds excellent.
TIM HARFORD: Compare ourselves, for example, to Ronald Fisher doing his early analysis of agricultural experiments nearly 100 years ago now at Rothamsted, just outside London. He had a mechanical calculator, and it could take weeks and weeks of work to tabulate a single table. And now anybody can do it with a computer you can have for just a couple of hundred quid. So we’ve just got far more ability to crunch numbers, far more access to the numbers, and that definitely makes a difference. That empowers people who know a little bit about how to deal with numbers.
KEVIN MCCONWAY: And linking that back to some things you were saying before, there’s still a necessity to understand what all this machinery is telling you, but I think you’ve said we’ve moved on a lot with that as well, both in terms of expert analysis and what people on the street can do. So I think that’s excellent.
5) Is it cooler to be a nerd now?
Transcript: "Oh definitely, yeah for sure. We're taking over the world..."
KEVIN MCCONWAY: Is it cooler to be a nerd than it was?
TIM HARFORD: Oh definitely, yeah for sure. We’re taking over the world. Think about Bill Gates now donating a multibillion-dollar fortune to help the poor. You’ve got The Big Bang Theory, a popular TV comedy about nerds. Nerd culture I think is in. We’re having our revenge against the jocks, so that’s good, I think, all round.
KEVIN MCCONWAY: Yes, it sounds excellent. Do you get nerdy fan mail?
TIM HARFORD: Yes, I get lovely nerdy fan mail actually. Just this morning, I arrived at the office and somebody had sent me a poster describing a particular economic theory, with a nice fan letter. So I can put this nerdy economics poster on my wall and show the letter to my wife and try to make her jealous, but I think she’s reasonably calm about such things these days. The groupie lifestyle is much exaggerated; it’s not quite Led Zeppelin, I have to say.
6) Is data science the future? Or just hype?
Transcript: "Well data science just seems to be a slightly different word for statistics..."
KEVIN MCCONWAY: Data science seems to be the word on many nerds’ lips. What’s that all about then? I mean some of my statistical colleagues say it’s a load of hype. Others say no it’s the future, it’s the future we’re moving into and, you know, it’s connected with open data that you were talking about and so on. What do you think?
TIM HARFORD: Well data science just seems to be a slightly different word for statistics. But I imagine that it also involves the handling of very large datasets, the programming of computers, best practice for using these datasets. Because these are potentially very large datasets indeed, and they might need techniques that are different from classical statistics. But I think generally when people say data science they’re basically just trying to come up with a cooler-sounding name.
It’s like statistics only you can become a billionaire and set up the next Facebook or something. So I think that’s what’s really going on. A little bit of hype but data are powerful, and Hal Varian who wrote the textbooks that taught me economics and then went on to become Chief Economist of Google, he says that statistics are the future. This is the career of the future, whether you call it statistician or call it data scientist I think probably doesn’t matter much.
KEVIN MCCONWAY: Right, so there’s an aspect of hype but there’s substance behind it, sounds good to me.
7) Is statistics a good career to be getting into? Why study statistics?
Transcript: "It possibly is... But I think the main reason is that it's fun, it's fascinating..."
KEVIN MCCONWAY: Some of the people who are going to be listening to this would be Open University students who are studying statistics or economics or various subjects like that. And maybe they’re thinking, they’re doing that because they’re thinking of getting into it as a career. Is now the time to get into those things as a career do you think?
TIM HARFORD: It possibly is. I think one would always be a little bit cautious about making career recommendations, for two reasons. One is you never know how the world is going to change. I’m reminded of that scene from The Graduate where Benjamin is told that the future is plastics. I’m not sure what that would mean you should study, and is the future still plastics, or is it now something else? But the other thing is there’s no point in studying something just because you speculate that it might offer you some career possibilities down the track.
Ultimately, if you’re not engaged by it, if you’re not enthused by it, you shouldn’t be doing it. That’s not to say one should entirely ignore career prospects, because you can end up studying something there’s not much of a market for. But I think the main reason to study statistics is that it’s fun, it’s fascinating, it equips you to think well about the world. The idea that maybe Hal Varian and his friends at Google are going to give you a job should be a secondary consideration, though not one you want to dismiss out of hand.
KEVIN MCCONWAY: So, a balance for you as always. Well Tim Harford, thank you very much.
TIM HARFORD: Thank you.