Beware Online Filter Bubbles
Eli Pariser

Mark Zuckerberg: a journalist was asking him a question about the news feed, and the journalist was asking him, you know, “Why is this so important?” And Zuckerberg said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” And I want to talk about what a web based on that idea of relevance might look like.

So when I was growing up in a really rural area in Maine, the internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there’s this kind of shift in how information is flowing online, and it’s invisible, and if we don’t pay attention to it, it could be a real problem. So I first noticed this in a place I spend a lot of time: my Facebook page.

I’m progressive, politically, big surprise, but I’ve always, you know, gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was kind of surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. What it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.

So Facebook isn’t the only place that’s doing this kind of invisible, algorithmic editing of the web. Google’s doing it too. If I search for something and you search for something, even right now, at the very same time, we may get very different search results. Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at, everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located, that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
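
To make that idea concrete, a minimal sketch of signal-based re-ranking might look like the following. The signal names, weights, and scoring rule are invented purely for illustration; this is not Google’s actual ranking system, just a toy showing how the same query can come back in a different order for different people.

    # Toy sketch of signal-based personalization (NOT Google's algorithm).
    # Signals, weights, and rules below are hypothetical illustrations.
    from dataclasses import dataclass

    @dataclass
    class UserSignals:
        device: str    # e.g. "laptop" or "phone"
        browser: str   # e.g. "chrome" or "safari"
        location: str  # e.g. "Cairo" or "Portland"

    def personalize(results, signals):
        """Re-rank (title, tags) pairs using per-user signals (toy rules)."""
        def score(item):
            title, tags = item
            s = 0.0
            # Hypothetical rules: boost pages tagged with the user's location,
            # and mobile-friendly pages when the user is on a phone.
            if signals.location.lower() in tags:
                s += 2.0
            if signals.device == "phone" and "mobile-friendly" in tags:
                s += 1.0
            return s
        return sorted(results, key=score, reverse=True)

    results = [
        ("Egypt protests: latest news", {"news", "cairo"}),
        ("Egypt travel and holidays", {"travel", "mobile-friendly"}),
    ]

    # The same query, two different users, two different orderings.
    print(personalize(results, UserSignals("laptop", "chrome", "Cairo")))
    print(personalize(results, UserSignals("phone", "safari", "Portland")))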

And you know, the funny thing about this is that it’s hard to see. You can’t see how different your search results are from anyone else’s. But a couple of weeks ago, I asked a bunch of friends to Google “Egypt” and to send me screenshots of what they got. So here’s my friend Scott’s screenshot, and here’s my friend Daniel’s screenshot. When you put them side by side, you don’t even have to read the links to see how different these two pages are. But when you do read the links, it’s really quite remarkable: Daniel didn’t get anything about the protests in Egypt at all in his first page of Google results. Scott’s results were full of them, and this was the big story of the day at that time. That’s how different these results are becoming.

So it’s not just Google and Facebook either. You know, this is something that’s sweeping the web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the internet, is now personalized; different people get different things. The Huffington Post, the Washington Post, the New York Times, all flirting with personalization in various ways. And this moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.”

So I do think this is a problem, and I think, if you take all of these filters together, if you take all of these algorithms, you get what I call a filter bubble. And your filter bubble is kind of your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in, and more importantly, you don’t actually see what gets edited out.

So one of the problems with the filter bubble was discovered by some researchers at Netflix. They were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is that there are some movies that just sort of zip right up and out to our houses; they enter the queue, they just zip right out. So Iron Man zips right out, and Waiting for “Superman” can wait for a really long time. What they discovered was that in our Netflix queues there’s kind of this epic struggle going on between our future, aspirational selves and our more impulsive, present selves. You know, we all want to be someone who has watched Rashomon, but right now we want to watch Ace Ventura for the fourth time. So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan; it gives us some information vegetables, it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they’re mainly looking at what you click on first, they can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
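
To see how click-only ranking can throw off that balance, here is a toy sketch of a feed that ranks stories purely by past clicks. The story labels and numbers are invented for illustration; no real feed works exactly like this, but the feedback loop it shows is the one described above.

    # Toy sketch of a click-feedback loop: rank stories only by click counts.
    # Labels and counts are invented for illustration.
    from collections import Counter

    clicks = Counter()  # how often each topic has been clicked so far

    def rank_feed(stories):
        """Show the most-clicked topics first; ties keep their original order."""
        return sorted(stories, key=lambda topic: -clicks[topic])

    feed = ["Afghanistan update", "Justin Bieber", "Local election", "Celebrity gossip"]

    # Simulate a user who keeps clicking the "dessert":
    for _ in range(5):
        clicks["Justin Bieber"] += 1
        clicks["Celebrity gossip"] += 1

    print(rank_feed(feed))
    # The "vegetables" sink to the bottom, and because lower-ranked items get
    # fewer clicks, the imbalance only reinforces itself over time.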

So what this suggests is actually that we may have the story about the internet wrong. In a broadcast society, this is how the founding mythology goes, right: in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that’s not actually what’s happening right now. What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important. This is what TED does, right? Other points of view.

And the thing is, we’ve actually kind of been here before as a society. In 1915, it’s not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important: that, in fact, you couldn’t have a functioning democracy if citizens didn’t get a good flow of information; that the newspapers were critical, because they were acting as the filter. And then journalistic ethics developed. It wasn’t perfect, but it got us through the last century. And so now, we’re kind of back in 1915 on the web, and we need the new gatekeepers to encode that kind of responsibility into the code that they’re writing.

You know, I know there are a lot of people here from Facebook and from Google, Larry and Sergey, people who have helped build the web as it is, and I’m grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they’re transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control, so that we can decide what gets through and what doesn’t. Because I think we really need the internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a web of one.

Thank you.
