How we can eliminate child sexual abuse material from the internet

Julie Cordua

[This talk contains mature content]

Five years ago,

I received a phone call
that would change my life.

I remember so vividly that day.

It was about this time of year,

and I was sitting in my office.

I remember the sun
streaming through the window.

And my phone rang.

And I picked it up,

and it was two federal agents,
asking for my help

in identifying a little girl

featured in hundreds of child
sexual abuse images they had found online.

They had just started working the case,

but what they knew

was that her abuse had been broadcast
to the world for years

on dark web sites dedicated
to the sexual abuse of children.

And her abuser was incredibly
technologically sophisticated:

new images and new videos every few weeks,

but very few clues as to who she was

or where she was.

And so they called us,

because they had heard
we were a new nonprofit

building technology
to fight child sexual abuse.

But we were only two years old,

and we had only worked
on child sex trafficking.

And I had to tell them

we had nothing.

We had nothing that could
help them stop this abuse.

It took those agents another year

to ultimately find that child.

And by the time she was rescued,

hundreds of images and videos
documenting her rape had gone viral,

from the dark web

to peer-to-peer networks,
private chat rooms

and to the websites you and I use

every single day.

And today, as she struggles to recover,

she lives with the fact
that thousands around the world

continue to watch her abuse.

I have come to learn
in the last five years

that this case is far from unique.

How did we get here as a society?

In the late 1980s, child pornography –

or what it actually is,
child sexual abuse material –

was nearly eliminated.

New laws and increased prosecutions
made it simply too risky

to trade it through the mail.

And then came the internet,
and the market exploded.

The amount of content in circulation today

is massive and growing.

This is a truly global problem,

but if we just look at the US:

in the US alone last year,

more than 45 million images and videos
of child sexual abuse material

were reported to the National Center
for Missing and Exploited Children,

and that is nearly double
the amount the year prior.

And the details behind these numbers
are hard to contemplate,

with more than 60 percent of the images
featuring children younger than 12,

and most of them including
extreme acts of sexual violence.

Abusers are cheered on in chat rooms
dedicated to the abuse of children,

where they gain rank and notoriety

with more abuse and more victims.

In this market,

the currency has become
the content itself.

It’s clear that abusers have been quick
to leverage new technologies,

but our response as a society has not.

These abusers don’t read
user agreements of websites,

and the content doesn’t honor
geographic boundaries.

And they win when we look
at one piece of the puzzle at a time,

which is exactly how
our response today is designed.

Law enforcement works in one jurisdiction.

Companies look at just their platform.

And whatever data they learn along the way

is rarely shared.

It is so clear that this
disconnected approach is not working.

We have to redesign
our response to this epidemic

for the digital age.

And that’s exactly
what we’re doing at Thorn.

We’re building the technology
to connect these dots,

to arm everyone on the front lines –

law enforcement, NGOs and companies –

with the tools they need
to ultimately eliminate

child sexual abuse material
from the internet.

Let’s talk for a minute –

(Applause)

Thank you.

(Applause)

Let’s talk for a minute
about what those dots are.

As you can imagine,
this content is horrific.

If you don’t have to look at it,
you don’t want to look at it.

And so, most companies
or law enforcement agencies

that have this content

can translate every file
into a unique string of numbers.

This is called a “hash.”

It’s essentially a fingerprint

for each file or each video.

And what this allows them to do
is use the information in investigations

or for a company to remove
the content from their platform,

without having to relook
at every image and every video each time.
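The fingerprinting step described above can be sketched in a few lines. This is a minimal illustration using a plain cryptographic hash (SHA-256); production systems typically use perceptual hashes such as Microsoft's PhotoDNA, which also match visually similar copies, but the core idea of translating a file into a unique string is the same. The file paths are hypothetical.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Translate a file into a unique string of characters -- its 'hash'."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even large video files are handled safely.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Byte-identical copies of a file always produce the same fingerprint,
# so known material can be recognized -- and removed -- without anyone
# having to view it again.
```

Because the fingerprint, not the image itself, is what gets stored and compared, agencies and companies can act on a file repeatedly without re-exposing anyone to its contents.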

The problem today, though,

is that there are hundreds
of millions of these hashes

sitting in siloed databases
all around the world.

In a silo,

it might work for the one agency
that has control over it,

but not connecting this data means
we don’t know how many are unique.

We don’t know which ones represent
children who have already been rescued

or need to be identified still.

So our first, most basic premise
is that all of this data

must be connected.

There are two ways in which this data,
combined with software on a global scale,

can have transformative
impact in this space.

The first is with law enforcement:

helping them identify new victims faster,

stopping abuse

and stopping those producing this content.

The second is with companies:

using it as clues to identify
the hundreds of millions of files

in circulation today,

pulling it down

and then stopping the upload
of new material before it ever goes viral.

Four years ago,

when that case ended,

our team sat there,
and we just felt this, um …

… deep sense of failure,
is the way I can put it,

because we watched that whole year

while they looked for her.

And we saw every place
in the investigation

where, if the technology
would have existed,

they would have found her faster.

And so we walked away from that

and we went and we did
the only thing we knew how to do:

we began to build software.

So we’ve started with law enforcement.

Our dream was an alarm bell on the desks
of officers all around the world

so that if anyone dared to post
a new victim online,

someone would start
looking for them immediately.

I obviously can’t talk about
the details of that software,

but today it’s at work in 38 countries,

having reduced the time it takes
to get to a child

by more than 65 percent.

(Applause)

And now we’re embarking
on that second horizon:

building the software to help companies
identify and remove this content.

Let’s talk for a minute
about these companies.

So, I told you – 45 million images
and videos in the US alone last year.

Those come from just 12 companies.

Twelve companies, 45 million files
of child sexual abuse material.

These come from those companies
that have the money

to build the infrastructure that it takes
to pull this content down.

But there are hundreds of other companies,

small- to medium-size companies
around the world,

that need to do this work,

but they either: 1) can’t imagine that
their platform would be used for abuse,

or 2) don’t have the money to spend
on something that is not driving revenue.

So we went ahead and built it for them,

and this system now gets smarter
as more companies participate.

Let me give you an example.

Our first partner, Imgur –
if you haven’t heard of this company,

it’s one of the most visited
websites in the US –

millions of pieces of user-generated
content uploaded every single day,

in a mission to make the internet
a more fun place.

They partnered with us first.

Within 20 minutes
of going live on our system,

someone tried to upload
a known piece of abuse material.

They were able to stop it,
pull it down

and report it to the National Center
for Missing and Exploited Children.

But they went a step further,

and they went and inspected the account
of the person who had uploaded it.

Hundreds more pieces
of child sexual abuse material

that we had never seen.

And this is where we start
to see exponential impact.

We pull that material down,

it gets reported to the National Center
for Missing and Exploited Children

and then those hashes
go back into the system

and benefit every other company on it.
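The feedback loop just described amounts to a shared hash set: every participating company screens uploads against it, and each newly discovered file adds its hash back for everyone. A minimal sketch, where the in-memory set and the two functions are hypothetical stand-ins for the real shared service:

```python
import hashlib

# Shared across participating companies (in reality, a central service).
known_abuse_hashes: set[str] = set()

def screen_upload(data: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches known material."""
    return hashlib.sha256(data).hexdigest() not in known_abuse_hashes

def report_new_material(data: bytes) -> None:
    """Hash newly discovered material and share the hash with every platform."""
    known_abuse_hashes.add(hashlib.sha256(data).hexdigest())

# Once one platform reports a file, every other platform blocks re-uploads:
report_new_material(b"<discovered file bytes>")
assert screen_upload(b"<discovered file bytes>") is False
assert screen_upload(b"unrelated file") is True
```

The design choice here is that only hashes circulate, never the files themselves, which is what lets one company's discovery protect every other platform in real time.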

And when the millions of hashes we have
lead to millions more and, in real time,

companies around the world are identifying
and pulling this content down,

we will have dramatically increased
the speed at which we are removing

child sexual abuse material
from the internet around the world.

(Applause)

But this is why it can’t just be
about software and data,

it has to be about scale.

We have to activate thousands of officers,

hundreds of companies around the world

if technology is to allow us
to outrun the perpetrators

and dismantle the communities
that are normalizing child sexual abuse

around the world today.

And the time to do this is now.

We can no longer say we don’t know
the impact this is having on our children.

The first generation of children
whose abuse has gone viral

are now young adults.

The Canadian Centre for Child Protection

just did a recent study
of these young adults

to understand the unique trauma
they try to recover from,

knowing that their abuse lives on.

Eighty percent of these young adults
have thought about suicide.

More than 60 percent
have attempted suicide.

And most of them live
with the fear every single day

that as they walk down the street
or they interview for a job

or they go to school

or they meet someone online,

that that person has seen their abuse.

And that fear came true
for more than 30 percent of them.

They had been recognized
from their abuse material online.

This is not going to be easy,

but it is not impossible.

Now it’s going to take the will,

the will of our society

to look at something
that is really hard to look at,

to take something out of the darkness

so these kids have a voice;

the will of companies to take action
and make sure that their platforms

are not complicit in the abuse of a child;

the will of governments to invest
with their law enforcement

in the tools they need to investigate
a digital-first crime,

even when the victims
cannot speak for themselves.

This audacious commitment
is part of that will.

It’s a declaration of war
against one of humanity’s darkest evils.

But what I hang on to

is that it’s actually
an investment in a future

where every child can simply be a kid.

Thank you.

(Applause)
