Digital thinking tools for better decision making

2 Beware of the bubble!

Not only does Google look for websites that match the search request you enter, it also looks for sites that match you. Based on your search history, location and any other data it holds about you, it tailors which results to display, so that eventually you might end up trapped in your own private ‘bubble’, which limits your horizons.

Similarly, Facebook’s ‘News Feed’ shows you ‘stories that … are influenced by your connections and activity on Facebook’.

Activity 3 The filter bubble

Timing: Allow 8 minutes.

Watch this TED talk on the ‘filter bubble’ by Eli Pariser.


Transcript: Video 1

[DING]
[WATER DROPS]
[MUSIC PLAYING]
[APPLAUSE]
ELI PARISER
Mark Zuckerberg-- a journalist was asking him a question about the News Feed. And the journalist was asking him, you know, why is this so important? And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interest right now than people dying in Africa." And I want to talk about what a web based on that idea of relevance might look like.
So when I was growing up in a really rural area in Maine, you know, the internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society.
But there's this kind of shift in how information is flowing online. And it's invisible. And if we don't pay attention to it, it could be a real problem. So I first noticed this in a place I'd spent a lot of time-- my Facebook page. I'm progressive politically-- big surprise. But I've always, you know, gone out of my way to meet conservatives. I like hearing what they're thinking about. I like seeing what they link to. I like learning a thing or two.
And so I was kind of surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And it turned out that what was going on was that Facebook was looking at which links I clicked on. And it was noticing that actually I was clicking more on my liberal friends' links than on my conservative friends' links.
And without consulting me about it, it had edited them out. They disappeared. So Facebook isn't the only place that's doing this kind of invisible algorithmic editing of the web. Google's doing it, too. If I search for something and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at-- everything from what kind of computer you're on to what kind of browser you're using to where you're located that it uses to personally tailor your query results.
Think about it for a second. There is no standard Google anymore. And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google Egypt and to send me screenshots of what they got.
So here's my friend Scott's screenshot. And here's my friend Daniel's screenshot. When you put them side by side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable.
Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming. So it's not just Google and Facebook either. You know, this is something that's sweeping the web. There are a whole host of companies that are doing this kind of personalization.
Yahoo News-- the biggest news site on the internet is now personalised. Different people get different things-- Huffington Post, The Washington Post, New York Times all flirting with personalization in various ways. And where this-- this moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.
As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not, in some sense, been tailored for them." So I do think this is a problem. And I think if you take all of these filters together, if you take all of these algorithms, you get what I call a filter bubble.
And your filter bubble is kind of your own personal unique universe of information that you live in online. And what's in your filter bubble depends on who you are. And it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues. And they noticed something kind of funny that a lot of us probably have noticed, which is there's some movies that just sort of zip right up and out to our houses. They enter the queue. They just zip right out.
So Iron Man zips right out, right? And Waiting for Superman can wait for a really long time. What they discovered was that in our Netflix queues, there's kind of this epic struggle going on between our future aspirational selves and our more impulsive present selves.
You know, we all want to be someone who has watched Rashomon.
[LAUGHTER]
But right now, we want to watch Ace Ventura for the fourth time.
[LAUGHTER]
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables. It gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalised filters, is that because they're mainly looking at what you click on first-- you know, you don't-- it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
So what this suggests is actually that we may have the story about the internet wrong. In a broadcast society-- you know, this is how the founding mythology goes, right? In a broadcast society, there were these gatekeepers, the editors. And they controlled the flows of information.
And along came the internet. And it swept them out of the way. And it allowed us-- all of us to connect together. And it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones.
And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important. This is what TED does, right-- other points of view.
And the thing is we've actually kind of been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then, people kind of noticed that they were doing something really important, that, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical, because they were acting as the filter, and that journalistic ethics developed. It wasn't perfect, but it got us through the last century.
And so now, we're kind of back in 1915 on the web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. You know, I know there are a lot of people here from Facebook and from Google, Larry and Sergey, who, you know, people, who have helped build the web as it is. And I'm grateful for that.
But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our philtres.
And we need you to give us some control so that we can decide what gets through and what doesn't, because I think we really need the internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a web of one. Thank you.
[APPLAUSE]
[CHEERING]
Thank you.
[MUSIC PLAYING]
End transcript: Video 1

Many people see filter bubbles as a threat to democracy. This is because they may result in us only sharing ideas with like-minded people, so that we are not exposed to differing points of view. We simply end up having our existing opinions and beliefs continually reinforced.

In his farewell address, President Barack Obama spoke of the ‘retreat into our own bubbles, ... especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions ...’.

Activity 4 Reflecting

Timing: Allow about 2 minutes

Having watched Eli Pariser’s talk, take a few moments to consider whether you are in a filter bubble.

Do you think you are getting a balanced information diet?


Take your learning further

Making the decision to study can be a big step, which is why you'll want a trusted University. The Open University has 50 years’ experience delivering flexible learning and 170,000 students are studying with us right now. Take a look at all Open University courses.

If you are new to University-level study, we offer two introductory routes to our qualifications. You could either choose to start with an Access module, or a module which allows you to count your previous learning towards an Open University qualification. Read our guide on Where to take your learning next for more information.

Not ready for formal University study? Then browse over 1000 free courses on OpenLearn and sign up to our newsletter to hear about new free courses as they are released.

Every year, thousands of students decide to study with The Open University. With over 120 qualifications, we’ve got the right course for you.

Request an Open University prospectus