The price of a clean internet
Hans Block and Moritz Riesewieck

[This talk contains mature content]

Moritz Riesewieck: On March 23, 2013,

users worldwide
discovered in their news feed

a video of a young girl
being raped by an older man.

Before this video
was removed from Facebook,

it had already been shared 16,000 times,

and it had even been liked 4,000 times.

This video went viral
and infected the net.

Hans Block: And that was the moment
we asked ourselves:

how could something like this
get on Facebook?

And at the same time,
why don’t we see such content more often?

After all, there’s a lot
of revolting material online,

but why do we so rarely see such crap
on Facebook, Twitter or Google?

MR: While image-recognition software

can identify the outlines
of sexual organs,

blood or naked skin in images and videos,

it has immense difficulty
distinguishing pornographic content

from holiday pictures, Adonis statues

or breast-cancer screening campaigns.

It can’t distinguish
Romeo and Juliet dying onstage

from a real knife attack.

It can’t distinguish satire
from propaganda

or irony from hatred,
and so on and so forth.

Therefore, humans are needed to decide

which suspicious content
should be deleted

and which should remain.

Humans whom we know almost nothing about,

because they work in secret.

They sign nondisclosure agreements,

which prohibit them
from talking and sharing

what they see on their screens
and what this work does to them.

They are forced to use code words
in order to hide who they work for.

They are monitored
by private security firms

in order to ensure
that they don’t talk to journalists.

And they are threatened with fines
if they speak.

All of this sounds
like a weird crime story,

but it’s true.

These people exist,

and they are called content moderators.

HB: We are the directors of the feature
documentary film “The Cleaners,”

and we would like to take you

to a world that many of you
may not know yet.

Here’s a short clip of our film.

(Music)

(Video) Moderator: I need to be anonymous,
because we have a contract signed.

We are not allowed to declare
whom we are working with.

The reason why I speak to you

is because the world should know
that we are here.

There is somebody
who is checking the social media.

We are doing our best
to make this platform

safe for all of them.

Delete.

Ignore.

Delete.

Ignore.

Delete.

Ignore.

Ignore.

Delete.

HB: The so-called content moderators

don’t get their paychecks from Facebook,
Twitter or Google themselves,

but from outsourcing firms
around the world

in order to keep wages low.

Tens of thousands of young people

looking at everything
we are not supposed to see.

And we are talking about
decapitations, mutilations,

executions, necrophilia,
torture, child abuse.

Thousands of images in one shift –

ignore, delete, day and night.

And much of this work is done in Manila,

where the analog toxic waste
from the Western world

was transported for years
by container ship;

now the digital waste is dumped there
via fiber-optic cable.

And just as the so-called scavengers

rummage through gigantic tips
on the edge of the city,

the content moderators click their way
through an endless toxic ocean

of images and videos and all manner
of intellectual garbage,

so that we don’t have to look at it.

MR: But unlike the wounds
of the scavengers,

those of the content moderators
remain invisible.

Full of shocking and disturbing content,

these pictures and videos
burrow into their memories

where, at any time,
they can have unpredictable effects:

eating disorders, loss of libido,

anxiety disorders, alcoholism,

depression, which can even
lead to suicide.

The pictures and videos infect them,

and often never let them go again.

If they are unlucky, they develop
post-traumatic stress disorder,

like soldiers after war missions.

In our film, we tell the story
of a young man

who had to monitor livestreams
of self-mutilations and suicide attempts,

again and again,

and who eventually
committed suicide himself.

It’s not an isolated case,
as we’ve been told.

This is the price all of us pay

for our so-called clean
and safe and “healthy”

environments on social media.

Never before in the history of mankind

has it been easier to reach
millions of people around the globe

in a few seconds.

What is posted on social media
spreads so quickly,

goes viral and inflames the minds
of people all around the globe.

By the time it is deleted,

it is often already too late.

Millions of people
have already been infected

with hatred and anger,

and they either become active online,

by spreading or amplifying hatred,

or they take to the streets
and take up arms.

HB: Therefore, an army
of content moderators

sits in front of the screen
to prevent new collateral damage.

And they are deciding,
as soon as possible,

whether the content
stays on the platform – ignore;

or disappears – delete.

But not every decision is as clear

as the decision about a child-abuse video.

What about controversial content,
ambivalent content,

uploaded by civil rights activists
or citizen journalists?

The content moderators
often decide on such cases

at the same speed as the [clear] cases.

MR: We will show you a video now,

and we would like to ask you to decide:

Would you delete it,

or would you not delete it?

(Video) (Air strike sounds)

(Explosion)

(People speaking in Arabic)

MR: Yeah, we did some blurring for you.

A child would potentially
be dangerously disturbed

and extremely frightened by such content.

So, would you rather delete it?

But what if this video could help
investigate the war crimes in Syria?

What if nobody had heard
about this air strike

because Facebook, YouTube and Twitter
had decided to take it down?

Airwars, a nongovernmental
organization based in London,

tries to find those videos
as quickly as possible

whenever they are uploaded
to social media,

in order to archive them.

Because they know that, sooner or later,

Facebook, YouTube and Twitter
will take such content down.

People armed with their mobile phones

can make visible what journalists
often do not have access to.

Civil rights groups often
have no better option

for quickly making their recordings
accessible to a large audience

than uploading them to social media.

Wasn’t this the empowering potential
the World Wide Web was supposed to have?

Weren’t these the dreams

people had about the World Wide Web
in its early days?

Can’t pictures and videos like these

persuade people who have become
insensitive to facts

to rethink?

HB: But instead, everything
that might be disturbing is deleted.

And there’s a general shift in society.

Media outlets, for example,
more and more often place trigger warnings

at the top of articles

that some readers may perceive
as offensive or troubling.

And more and more students
at universities in the United States

demand that ancient classics
depicting sexual violence or assault

be banned from the curriculum.

But how far should we go with that?

Physical integrity is guaranteed
as a human right

in constitutions worldwide.

In the Charter of Fundamental Rights
of the European Union,

this right expressly applies
to mental integrity.

But even if the potentially
traumatic effect

of images and videos is hard to predict,

do we want to become so cautious

that we risk losing
social awareness of injustice?

So what to do?

Mark Zuckerberg recently stated
that in the future,

the users – we, or almost everybody –

will decide individually

what they would like to see
on the platform,

via personal filter settings.

So everyone could easily choose
to remain undisturbed

by images of war
or other violent conflicts, like …

MR: I’m the type of guy
who doesn’t mind seeing breasts

and I’m very interested in global warming,

but I don’t like war so much.

HB: Yeah, I’m more the opposite,

I have zero interest in naked breasts
or naked bodies at all.

But why not guns? I like guns, yes.

MR: Come on, if we don’t share
a similar social consciousness,

how shall we discuss social problems?

How shall we call people to action?

Even more isolated bubbles would emerge.

One of the central questions is
how, in the future,

freedom of expression will be weighed
against people’s need for protection.

It’s a matter of principle.

Do we want to design
the digital space

as an open or a closed society?

At the heart of the matter
is “freedom versus security.”

Facebook has always wanted to be
a “healthy” platform.

Above all, users should feel
safe and secure.

It’s the same choice of words

the content moderators
in the Philippines used

in a lot of our interviews.

(Video) The world
that we are living in right now,

I believe, is not really healthy.

(Music)

In this world, there is really
an evil who exists.

(Music)

We need to watch for it.

(Music)

We need to control it – good or bad.

(Music)

[Look up, Young man! –God]

MR: For the young content moderators
in the strictly Catholic Philippines,

this is linked to a Christian mission:

to counter the sins of the world,

which spread across the web.

“Cleanliness is next to godliness,”

is a saying everybody
in the Philippines knows.

HB: And others motivate themselves

by comparing themselves
with their president, Rodrigo Duterte.

He has been ruling
the Philippines since 2016,

and he won the election
with the promise: “I will clean up.”

And what that means is eliminating
all kinds of problems

by literally killing people on the streets

who are alleged to be criminals,
whatever that means.

And since he was elected,

an estimated 20,000 people
have been killed.

And one moderator in our film says,

“What Duterte does on the streets,

I do for the internet.”

And here they are,
our self-proclaimed superheroes,

who enforce law and order
in our digital world.

They clean up,
they polish everything clean,

they free us from everything evil.

Tasks formerly reserved
for state authorities

have been taken over
by college graduates in their early 20s,

equipped with
three to five days of training –

this is the qualification –

who work on nothing less
than rescuing the world.

MR: National sovereignties
have been outsourced to private companies,

and they pass on their
responsibilities to third parties.

What takes place is an outsourcing
of the outsourcing of the outsourcing.

With social networks,

we are dealing with a completely
new infrastructure,

with its own mechanisms,

its own logic of action

and therefore, also, its own new dangers,

which did not exist
in the pre-digital public sphere.

HB: When Mark Zuckerberg
was at the US Congress

or at the European Parliament,

he was confronted
with all kinds of criticism.

And his reaction was always the same:

“We will fix that,

and I will follow up on that
with my team.”

But such a debate shouldn’t be held
in the back rooms of Facebook,

Twitter or Google –

it should be conducted openly
in new, cosmopolitan parliaments,

in new institutions
that reflect the diversity of people

contributing to a utopian project
of a global network.

And while it may seem impossible
to consider the values

of users worldwide,

it’s worth believing

that there’s more that connects us
than separates us.

MR: Yeah, at a time
when populism is gaining strength,

it becomes popular
to justify the symptoms,

to eradicate them,

to make them invisible.

This ideology is spreading worldwide,

analog as well as digital,

and it’s our duty to stop it

before it’s too late.

The question of freedom and democracy

must not be reduced to only these two options.

HB: Delete.

MR: Or ignore.

HB: Thank you very much.

(Applause)
