The price of a clean internet
Hans Block and Moritz Riesewieck

[This talk contains mature content]

Moritz Riesewieck: On March 23, 2013,

users worldwide
discovered in their news feed

a video of a young girl
being raped by an older man.

Before this video
was removed from Facebook,

it had already been shared 16,000 times,

and it had even been liked 4,000 times.

This video went viral
and infected the net.

Hans Block: And that was the moment
we asked ourselves:

How could something like this
get onto Facebook?

And at the same time,
why don’t we see such content more often?

After all, there’s a lot
of revolting material online,

but why do we so rarely see such crap
on Facebook, Twitter or Google?

MR: While image-recognition software

can identify the outlines
of sexual organs,

blood or naked skin in images and videos,

it has immense difficulty
distinguishing pornographic content

from holiday pictures, Adonis statues

or breast-cancer screening campaigns.

It can’t distinguish
Romeo and Juliet dying onstage

from a real knife attack.

It can’t distinguish satire
from propaganda

or irony from hatred,
and so on and so forth.

Therefore, humans are needed to decide

which suspicious content
should be deleted,

and which should remain.

Humans whom we know almost nothing about,

because they work in secret.

They sign nondisclosure agreements,

which prohibit them from talking about
or sharing

what they see on their screens
and what this work does to them.

They are forced to use code words
in order to hide who they work for.

They are monitored
by private security firms

in order to ensure
that they don’t talk to journalists.

And they are threatened with fines
if they speak.

All of this sounds
like a weird crime story,

but it’s true.

These people exist,

and they are called content moderators.

HB: We are the directors of the feature
documentary film “The Cleaners,”

and we would like to take you

to a world that many of you
may not know yet.

Here’s a short clip of our film.

(Music)

(Video) Moderator: I need to be anonymous,
because we have a contract signed.

We are not allowed to declare
whom we are working with.

The reason why I speak to you

is because the world should know
that we are here.

There is somebody
who is checking the social media.

We are doing our best
to make this platform

safe for all of them.

Delete.

Ignore.

Delete.

Ignore.

Delete.

Ignore.

Ignore.

Delete.

HB: The so-called content moderators

don’t get their paychecks from Facebook,
Twitter or Google themselves,

but from outsourcing firms
around the world

in order to keep the wages low.

Tens of thousands of young people

looking at everything
we are not supposed to see.

And we are talking about
decapitations, mutilations,

executions, necrophilia,
torture, child abuse.

Thousands of images in one shift –

ignore, delete, day and night.

And much of this work is done in Manila,

where the analog toxic waste
from the Western world

was transported for years
by container ships;

now the digital waste is dumped there
via fiber-optic cable.

And just as the so-called scavengers

rummage through gigantic tips
on the edge of the city,

the content moderators click their way
through an endless toxic ocean

of images and videos and all manner
of intellectual garbage,

so that we don’t have to look at it.

MR: But unlike the wounds
of the scavengers,

those of the content moderators
remain invisible.

Full of shocking and disturbing content,

these pictures and videos
burrow into their memories

where, at any time,
they can have unpredictable effects:

eating disorders, loss of libido,

anxiety disorders, alcoholism,

depression, which can even
lead to suicide.

The pictures and videos infect them,

and often never let them go again.

If they are unlucky, they develop
post-traumatic stress disorder,

like soldiers after war missions.

In our film, we tell the story
of a young man

who had to monitor livestreams
of self-mutilations and suicide attempts,

again and again,

and who eventually
committed suicide himself.

It’s not an isolated case,
we’ve been told.

This is the price all of us pay

for our so-called clean
and safe and “healthy”

environments on social media.

Never before in the history of mankind

has it been easier to reach
millions of people around the globe

in a few seconds.

What is posted on social media
spreads so quickly,

goes viral and excites the minds
of people all around the globe.

By the time it is deleted,

it is often already too late.

Millions of people
have already been infected

with hatred and anger,

and they either become active online,

by spreading or amplifying hatred,

or they take to the streets
and take up arms.

HB: Therefore, an army
of content moderators

sits in front of screens
to avoid new collateral damage.

And they are deciding,
as soon as possible,

whether the content
stays on the platform – ignore;

or disappears – delete.

But not every decision is as clear

as the decision about a child-abuse video.

What about controversial content,
ambivalent content,

uploaded by civil rights activists
or citizen journalists?

The content moderators
often decide on such cases

at the same speed as the [clear] cases.

MR: We will show you a video now,

and we would like to ask you to decide:

Would you delete it,

or would you not delete it?

(Video) (Air strike sounds)

(Explosion)

(People speaking in Arabic)

MR: Yeah, we did some blurring for you.

A child would potentially
be dangerously disturbed

and extremely frightened by such content.

So, would you rather delete it?

But what if this video could help
investigate the war crimes in Syria?

What if nobody had heard
about this air strike

because Facebook, YouTube or Twitter
had decided to take it down?

Airwars, a nongovernmental
organization based in London,

tries to find those videos
as quickly as possible

whenever they are uploaded
to social media,

in order to archive them.

Because they know, sooner or later,

Facebook, YouTube or Twitter
will take such content down.

People armed with their mobile phones

can make visible what journalists
often do not have access to.

Civil rights groups often
do not have any better option

to quickly make their recordings
accessible to a large audience

than by uploading them to social media.

Wasn’t this the empowering potential
the World Wide Web was supposed to have?

Weren’t these the dreams

people in its early stages had
about the World Wide Web?

Can’t pictures and videos like these

persuade people who have become
insensitive to facts

to rethink?

HB: But instead, everything
that might be disturbing is deleted.

And there’s a general shift in society.

Media outlets, for example, more and more often
place trigger warnings

at the top of articles

that some people may perceive
as offensive or troubling.

Or more and more students
at universities in the United States

demand that ancient classics

which depict sexual violence or assault
be removed from the curriculum.

But how far should we go with that?

Physical integrity is guaranteed
as a human right

in constitutions worldwide.

In the Charter of Fundamental Rights
of the European Union,

this right expressly applies
to mental integrity.

But even if the potentially
traumatic effect

of images and videos is hard to predict,

do we want to become so cautious

that we risk losing
social awareness of injustice?

So what to do?

Mark Zuckerberg recently stated
that in the future,

the users, we, or almost everybody,

will decide individually

what they would like to see
on the platform,

via personal filter settings.

So everyone could easily claim
to remain undisturbed

by images of war
or other violent conflicts, like …

MR: I’m the type of guy
who doesn’t mind seeing breasts

and I’m very interested in global warming,

but I don’t like war so much.

HB: Yeah, I’m more the opposite,

I have zero interest in naked breasts
or naked bodies at all.

But why not guns? I like guns, yes.

MR: Come on, if we don’t share
a similar social consciousness,

how shall we discuss social problems?

How shall we call people to action?

Even more isolated bubbles would emerge.

One of the central questions is how,
in the future,

freedom of expression will be weighed
against people’s need for protection.

It’s a matter of principle.

Do we want to design
an open or a closed society

for the digital space?

At the heart of the matter
is “freedom versus security.”

Facebook has always wanted to be
a “healthy” platform.

Above all, users should feel
safe and secure.

It’s the same choice of words

the content moderators
in the Philippines used

in a lot of our interviews.

(Video) The world
that we are living in right now,

I believe, is not really healthy.

(Music)

In this world, there is really
an evil who exists.

(Music)

We need to watch for it.

(Music)

We need to control it – good or bad.

(Music)

[Look up, Young man! –God]

MR: For the young content moderators
in the strictly Catholic Philippines,

this is linked to a Christian mission:

to counter the sins of the world,
which spread across the web.

“Cleanliness is next to godliness,”

is a saying everybody
in the Philippines knows.

HB: And others motivate themselves

by comparing themselves
with their president, Rodrigo Duterte.

He has been ruling
the Philippines since 2016,

and he won the election
with the promise: “I will clean up.”

And what that means is eliminating
all kinds of problems

by literally killing people on the streets

who are supposed to be criminals,
whatever that means.

And since he was elected,

an estimated 20,000 people
have been killed.

And one moderator in our film says,

“What Duterte does on the streets,

I do for the internet.”

And here they are,
our self-proclaimed superheroes,

who enforce law and order
in our digital world.

They clean up,
they polish everything clean,

they free us from everything evil.

Tasks formerly reserved
for state authorities

have been taken over
by college graduates in their early 20s,

equipped with
three- to five-day training –

this is the qualification –

who work on nothing less
than rescuing the world.

MR: National sovereignties
have been outsourced to private companies,

and they pass on their
responsibilities to third parties.

It’s an outsourcing of the outsourcing
of the outsourcing

that is taking place.

With social networks,

we are dealing with a completely
new infrastructure,

with its own mechanisms,

its own logic of action

and therefore, also, its own new dangers,

which did not exist
in the pre-digital public sphere.

HB: When Mark Zuckerberg
was at the US Congress

or at the European Parliament,

he was confronted
with all kinds of criticism.

And his reaction was always the same:

“We will fix that,

and I will follow up on that
with my team.”

But such a debate shouldn’t be held
in back rooms of Facebook,

Twitter or Google –

it should be held openly
in new, cosmopolitan parliaments,

in new institutions
that reflect the diversity of people

contributing to a utopian project
of a global network.

And while it may seem impossible
to consider the values

of users worldwide,

it’s worth believing

that there’s more that connects us
than separates us.

MR: Yeah, at a time
when populism is gaining strength,

it becomes popular
to justify the symptoms,

to eradicate them,

to make them invisible.

This ideology is spreading worldwide,

analog as well as digital,

and it’s our duty to stop it

before it’s too late.

The question of freedom and democracy

must not be reduced to these two options.

HB: Delete.

MR: Or ignore.

HB: Thank you very much.

(Applause)