Dear Facebook, this is how you're breaking democracy
Yael Eisenstat

Around five years ago,

it struck me that I was losing the ability

to engage with people
who aren’t like-minded.

The idea of discussing hot-button issues
with my fellow Americans

was starting to give me more heartburn

than the times that I engaged
with suspected extremists overseas.

It was starting to leave me feeling
more embittered and frustrated.

And so just like that,

I shifted my entire focus

from global national security threats

to trying to understand
what was causing this push

towards extreme polarization at home.

As a former CIA officer and diplomat

who spent years working
on counterextremism issues,

I started to fear that this was becoming
a far greater threat to our democracy

than any foreign adversary.

And so I started digging in,

and I started speaking out,

which eventually led me
to being hired at Facebook

and ultimately brought me here today

to continue warning you
about how these platforms

are manipulating
and radicalizing so many of us

and to talk about
how to reclaim our public square.

I was a foreign service officer in Kenya

just a few years after
the September 11 attacks,

and I led what some call
“hearts and minds” campaigns

along the Somalia border.

A big part of my job
was to build trust with communities

deemed the most susceptible
to extremist messaging.

I spent hours drinking tea
with outspoken anti-Western clerics

and even dialogued
with some suspected terrorists,

and while many of these engagements
began with mutual suspicion,

I don’t recall any of them
resulting in shouting or insults,

and in some cases we even worked together
on areas of mutual interest.

The most powerful tools we had
were to simply listen, learn

and build empathy.

This is the essence
of hearts and minds work,

because what I found again and again
is that what most people wanted

was to feel heard,
validated and respected.

And I believe that’s what most of us want.

So what I see happening online today
is especially heartbreaking

and a much harder problem to tackle.

We are being manipulated
by the current information ecosystem

entrenching so many of us
so far into absolutism

that compromise has become a dirty word.

Because right now,

social media companies like Facebook

profit off of segmenting us
and feeding us personalized content

that both validates
and exploits our biases.

Their bottom line depends
on provoking a strong emotion

to keep us engaged,

often incentivizing the most
inflammatory and polarizing voices,

to the point where finding common ground
no longer feels possible.

And despite a growing chorus of people
crying out for the platforms to change,

it’s clear they will not
do enough on their own.

So governments must define
the responsibility

for the real-world harms being caused
by these business models

and impose real costs
on the damaging effects

they’re having on our public health,
our public square and our democracy.

But unfortunately, this won’t happen
in time for the US presidential election,

so I am continuing to raise this alarm,

because even if one day
we do have strong rules in place,

it will take all of us to fix this.

When I started shifting my focus
from threats abroad

to the breakdown
in civil discourse at home,

I wondered if we could repurpose
some of these hearts and minds campaigns

to help heal our divides.

Our more than 200-year
experiment with democracy works

in large part because we are able
to openly and passionately

debate our ideas for the best solutions.

But while I still deeply believe

in the power of face-to-face
civil discourse,

it just cannot compete

with the polarizing effects
and scale of social media right now.

The people who are sucked
down these rabbit holes

of social media outrage

often seem far harder to break
out of their ideological mindsets

than those vulnerable communities
I worked with ever were.

So when Facebook called me in 2018

and offered me this role

heading its elections integrity operations
for political advertising,

I felt I had to say yes.

I had no illusions
that I would fix it all,

but when offered the opportunity

to help steer the ship
in a better direction,

I had to at least try.

I didn’t work directly on polarization,

but I did look at which issues
were the most divisive in our society

and therefore the most exploitable
in elections interference efforts,

which was Russia’s tactic ahead of 2016.

So I started by asking questions.

I wanted to understand
the underlying systemic issues

that were allowing all of this to happen,

in order to figure out how to fix it.

Now I still do believe
in the power of the internet

to bring more voices to the table,

but despite their stated goal
of building community,

the largest social media companies
as currently constructed

are antithetical to the concept
of reasoned discourse.

There’s no way to reward listening,

to encourage civil debate

and to protect people
who sincerely want to ask questions

in a business where optimizing
engagement and user growth

are the two most important
metrics for success.

There’s no incentive
to help people slow down,

to build in enough friction
that people have to stop,

recognize their emotional
reaction to something,

and question their own
assumptions before engaging.

The unfortunate reality is:

lies are more engaging online than truth,

and salaciousness beats out
wonky, fact-based reasoning

in a world optimized
for frictionless virality.

As long as algorithms' goals
are to keep us engaged,

they will continue to feed us the poison
that plays to our worst instincts

and human weaknesses.

And yes, anger, mistrust,

the culture of fear, hatred:

none of this is new in America.

But in recent years,
social media has harnessed all of that

and, as I see it,
dramatically tipped the scales.

And Facebook knows it.

A recent “Wall Street Journal” article

exposed an internal
Facebook presentation from 2018

that specifically points
to the company’s own algorithms

for growing extremist groups'
presence on their platform

and for polarizing their users.

But keeping us engaged
is how they make their money.

The modern information environment
is crystallized around profiling us

and then segmenting us
into more and more narrow categories

to perfect this personalization process.

We’re then bombarded
with information confirming our views,

reinforcing our biases,

and making us feel
like we belong to something.

These are the same tactics
we would see terrorist recruiters

using on vulnerable youth,

albeit in smaller, more localized ways
before social media,

with the ultimate goal
of influencing their behavior.

Unfortunately, I was never empowered
by Facebook to have an actual impact.

In fact, on my second day,
my title and job description were changed

and I was cut out
of decision-making meetings.

My biggest efforts,

trying to build plans

to combat disinformation
and voter suppression in political ads,

were rejected.

And so I lasted just shy of six months.

But here is my biggest takeaway
from my time there.

There are thousands of people at Facebook

who are passionately working on a product

that they truly believe
makes the world a better place,

but as long as the company continues
to merely tinker around the margins

of content policy and moderation,

as opposed to considering

how the entire machine
is designed and monetized,

they will never truly address
how the platform is contributing

to hatred, division and radicalization.

And that’s the one conversation
I never heard happen during my time there,

because that would require
fundamentally accepting

that the thing you built
might not be the best thing for society

and agreeing to alter
the entire product and profit model.

So what can we do about this?

I’m not saying that social media
bears the sole responsibility

for the state that we’re in today.

Clearly, we have deep-seated
societal issues that we need to solve.

But Facebook’s response,
that it is just a mirror to society,

is a convenient attempt
to deflect any responsibility

from the way their platform
is amplifying harmful content

and pushing some users
towards extreme views.

And Facebook could, if they wanted to,

fix some of this.

They could stop amplifying
and recommending the conspiracy theorists,

the hate groups,
the purveyors of disinformation

and, yes, in some cases
even our president.

They could stop using
the same personalization techniques

to deliver political rhetoric
that they use to sell us sneakers.

They could retrain their algorithms

to focus on a metric
other than engagement,

and they could build in guardrails
to stop certain content from going viral

before being reviewed.

And they could do all of this

without becoming what they call
the arbiters of truth.

But they’ve made it clear
that they will not go far enough

to do the right thing
without being forced to,

and, to be frank, why should they?

The markets keep rewarding them,
and they’re not breaking the law.

Because as it stands,

there are no US laws compelling Facebook,
or any social media company,

to protect our public square,

our democracy

and even our elections.

We have ceded the decision-making
on what rules to write and what to enforce

to the CEOs of for-profit
internet companies.

Is this what we want?

A post-truth world
where toxicity and tribalism

trump bridge-building
and consensus-seeking?

I do remain optimistic that we still
have more in common with each other

than the current media
and online environment portray,

and I do believe that having
more perspectives surface

makes for a more robust
and inclusive democracy.

But not the way it’s happening right now.

And it bears emphasizing,
I do not want to kill off these companies.

I just want them held
to a certain level of accountability,

just like the rest of society.

It is time for our governments
to step up and do their jobs

of protecting our citizenry.

And while there isn’t
one magical piece of legislation

that will fix this all,

I do believe that governments
can and must find the balance

between protecting free speech

and holding these platforms accountable
for their effects on society.

And they could do so in part
by insisting on actual transparency

around how these recommendation
engines are working,

around how the curation, amplification
and targeting are happening.

You see, I want these companies
held accountable

not for whether an individual
posts misinformation

or extreme rhetoric,

but for how their
recommendation engines spread it,

how their algorithms
are steering people towards it,

and how their tools are used
to target people with it.

I tried to make change
from within Facebook and failed,

and so I’ve been using my voice again
for the past few years

to continue sounding this alarm

and hopefully inspire more people
to demand this accountability.

My message to you is simple:

pressure your government representatives

to step up and stop ceding
our public square to for-profit interests.

Help educate your friends and family

about how they’re being
manipulated online.

Push yourselves to engage
with people who aren’t like-minded.

Make this issue a priority.

We need a whole-society
approach to fix this.

And my message to the leaders
of my former employer Facebook is this:

right now, people are using your tools
exactly as they were designed

to sow hatred, division and distrust,

and you’re not just allowing it,
you are enabling it.

And yes, there are lots of great stories

of positive things happening
on your platform around the globe,

but that doesn’t make any of this OK.

And it’s only getting worse
as we’re heading into our election,

and, even more concerning,

as we face our biggest potential crisis yet,

if the results aren’t trusted
and if violence breaks out.

So when in 2021 you once again say,
“We know we have to do better,”

I want you to remember this moment,

because it’s no longer
just a few outlier voices.

Civil rights leaders, academics,

journalists, advertisers,
your own employees,

are shouting from the rooftops

that your policies
and your business practices

are harming people and democracy.

You own your decisions,

but you can no longer say
that you couldn’t have seen it coming.

Thank you.
