The nightmare videos of children's YouTube and what's wrong with the internet today – James Bridle

I’m James.

I’m a writer and artist,

and I make work about technology.

I do things like draw life-size outlines
of military drones

in city streets around the world,

so that people can start to think
and get their heads around

these really quite hard-to-see
and hard-to-think-about technologies.

I make things like neural networks
that predict the results of elections

based on weather reports,

because I’m intrigued about

what the actual possibilities
of these weird new technologies are.

Last year, I built
my own self-driving car.

But because I don’t
really trust technology,

I also designed a trap for it.

(Laughter)

And I do these things mostly because
I find them completely fascinating,

but also because I think
when we talk about technology,

we’re largely talking about ourselves

and the way that we understand the world.

So here’s a story about technology.

This is a “surprise egg” video.

It’s basically a video of someone
opening up loads of chocolate eggs

and showing the toys inside to the viewer.

That’s it. That’s all it does
for seven long minutes.

And I want you to notice
two things about this.

First of all, this video
has 30 million views.

(Laughter)

And the other thing is,

it comes from a channel
that has 6.3 million subscribers,

that has a total of eight billion views,

and it’s all just more videos like this –

30 million people watching a guy
opening up these eggs.

It sounds pretty weird, but if you search
for “surprise eggs” on YouTube,

it’ll tell you there’s
10 million of these videos,

and I think that’s an undercount.

I think there’s way, way more of these.

If you keep searching, they’re endless.

There’s millions and millions
of these videos

in increasingly baroque combinations
of brands and materials,

and there’s more and more of them
being uploaded every single day.

Like, this is a strange world. Right?

But the thing is, it’s not adults
who are watching these videos.

It’s kids, small children.

These videos are
like crack for little kids.

There’s something about the repetition,

the constant little
dopamine hit of the reveal,

that completely hooks them in.

And little kids watch these videos
over and over and over again,

and they do it for hours
and hours and hours.

And if you try and take
the screen away from them,

they’ll scream and scream and scream.

If you don’t believe me –

and I’ve already seen people
in the audience nodding –

if you don’t believe me, find someone
with small children and ask them,

and they’ll know about
the surprise egg videos.

So this is where we start.

It’s 2018, and someone, or lots of people,

are using the same mechanism that, like,
Facebook and Instagram are using

to get you to keep checking that app,

and they’re using it on YouTube
to hack the brains of very small children

in return for advertising revenue.

At least, I hope
that’s what they’re doing.

I hope that’s what they’re doing it for,

because there’s easier ways
of making ad revenue on YouTube.

You can just make stuff up or steal stuff.

So if you search for really
popular kids' cartoons

like “Peppa Pig” or “Paw Patrol,”

you’ll find there’s millions and millions
of these online as well.

Of course, most of them aren’t posted
by the original content creators.

They come from loads and loads
of different random accounts,

and it’s impossible to know
who’s posting them

or what their motives might be.

Does that sound kind of familiar?

Because it’s exactly the same mechanism

that’s happening across most
of our digital services,

where it’s impossible to know
where this information is coming from.

It’s basically fake news for kids,

and we’re training them from birth

to click on the very first link
that comes along,

regardless of what the source is.

That doesn’t seem like
a terribly good idea.

Here’s another thing
that’s really big on kids' YouTube.

This is called the “Finger Family Song.”

I just heard someone groan
in the audience.

This is the “Finger Family Song.”

This is the very first one I could find.

It’s from 2007, and it only has
200,000 views,

which is, like, nothing in this game.

But it has this insanely earwormy tune,

which I’m not going to play to you,

because it will sear itself
into your brain

in the same way that
it seared itself into mine,

and I’m not going to do that to you.

But like the surprise eggs,

it’s got inside kids' heads

and addicted them to it.

So within a few years,
these finger family videos

start appearing everywhere,

and you get versions
in different languages

with popular kids' cartoons using food

or, frankly, using whatever kind
of animation elements

you seem to have lying around.

And once again, there are millions
and millions and millions of these videos

available online in all of these
kind of insane combinations.

And the more time
you start to spend with them,

the crazier and crazier
you start to feel that you might be.

And that’s where I
kind of launched into this,

that feeling of deep strangeness
and deep lack of understanding

of how this thing was constructed
that seems to be presented around me.

Because it’s impossible to know
where these things are coming from.

Like, who is making them?

Some of them appear to be made
by teams of professional animators.

Some of them are just randomly
assembled by software.

Some of them are quite wholesome-looking
young kids' entertainers.

And some of them are from people

who really clearly
shouldn’t be around children at all.

(Laughter)

And once again, this impossibility
of figuring out who’s making this stuff –

like, is this a bot?

Is this a person? Is this a troll?

What does it mean
that we can’t tell the difference

between these things anymore?

And again, doesn’t that uncertainty
feel kind of familiar right now?

So the main way people get views
on their videos –

and remember, views mean money –

is that they stuff the titles
of these videos with these popular terms.

So you take, like, “surprise eggs”

and then you add
“Paw Patrol,” “Easter egg,”

or whatever these things are,

all of these words from other
popular videos into your title,

until you end up with this kind of
meaningless mash of language

that doesn’t make sense to humans at all.

Because of course it’s only really
tiny kids who are watching your video,

and what the hell do they know?

Your real audience
for this stuff is software.

It’s the algorithms.

It’s the software that YouTube uses

to select which videos
are like other videos,

to make them popular,
to make them recommended.

And that’s why you end up with this
kind of completely meaningless mash,

both of title and of content.

But the thing is, you have to remember,

there really are still people within
this algorithmically optimized system,

people who are kind
of increasingly forced to act out

these increasingly bizarre
combinations of words,

like a desperate improvisation artist
responding to the combined screams

of a million toddlers at once.

There are real people
trapped within these systems,

and that’s the other deeply strange thing
about this algorithmically driven culture,

because even if you’re human,

you have to end up behaving like a machine

just to survive.

And also, on the other side of the screen,

there still are these little kids
watching this stuff,

stuck, their full attention grabbed
by these weird mechanisms.

And most of these kids are too small
to even use a website.

They’re just kind of hammering
on the screen with their little hands.

And so there’s autoplay,

where it just keeps playing these videos
over and over and over in a loop,

endlessly for hours and hours at a time.

And there’s so much weirdness
in the system now

that autoplay takes you
to some pretty strange places.

This is how, within a dozen steps,

you can go from a cute video
of a counting train

to masturbating Mickey Mouse.

Yeah. I’m sorry about that.

This does get worse.

This is what happens

when all of these different keywords,

all these different pieces of attention,

this desperate generation of content,

all comes together into a single place.

This is where all those deeply weird
keywords come home to roost.

You cross-breed the finger family video

with some live-action superhero stuff,

you add in some weird,
trollish in-jokes or something,

and suddenly, you come
to a very weird place indeed.

The stuff that tends to upset parents

is the stuff that has kind of violent
or sexual content, right?

Children’s cartoons getting assaulted,

getting killed,

weird pranks that actually
genuinely terrify children.

What you have is software pulling in
all of these different influences

to automatically generate
kids' worst nightmares.

And this stuff really, really
does affect small children.

Parents report their children
being traumatized,

becoming afraid of the dark,

becoming afraid of their favorite
cartoon characters.

If you take one thing away from this,
it’s that if you have small children,

keep them the hell away from YouTube.

(Applause)

But the other thing, the thing
that really gets to me about this,

is that I’m not sure we even really
understand how we got to this point.

We’ve taken all of this influence,
all of these things,

and munged them together in a way
that no one really intended.

And yet, this is also the way
that we’re building the entire world.

We’re taking all of this data,

a lot of it bad data,

a lot of historical data
full of prejudice,

full of all of our worst
impulses of history,

and we’re building that
into huge data sets

and then we’re automating it.

And we’re munging it together
into things like credit reports,

into insurance premiums,

into things like predictive
policing systems,

into sentencing guidelines.

This is the way we’re actually
constructing the world today

out of this data.

And I don’t know what’s worse,

that we built a system
that seems to be entirely optimized

for the absolute worst aspects
of human behavior,

or that we seem
to have done it by accident,

without even realizing
that we were doing it,

because we didn’t really understand
the systems that we were building,

and we didn’t really understand
how to do anything differently with it.

There’s a couple of things I think
that really seem to be driving this

most fully on YouTube,

and the first of those is advertising,

which is the monetization of attention

without any real other variables at work,

any care for the people who are
actually developing this content,

the centralization of the power,
the separation of those things.

And I think however you feel
about the use of advertising

to kind of support stuff,

the sight of grown men in diapers
rolling around in the sand

in the hope that an algorithm
that they don’t really understand

will give them money for it

suggests that this
probably isn’t the thing

that we should be basing
our society and culture upon,

and the way in which
we should be funding it.

And the other thing that’s kind of
the major driver of this is automation,

which is the deployment
of all of this technology

as soon as it arrives,
without any kind of oversight,

and then once it’s out there,

kind of throwing up our hands and going,
“Hey, it’s not us, it’s the technology.”

Like, “We’re not involved in it.”

That’s not really good enough,

because this stuff isn’t
just algorithmically governed,

it’s also algorithmically policed.

When YouTube first started
to pay attention to this,

the first thing they said
they’d do about it

was that they’d deploy
better machine learning algorithms

to moderate the content.

Well, machine learning,
as any expert in it will tell you,

is basically what we’ve started to call

software that we don’t really
understand how it works.

And I think we have
enough of that already.

We shouldn’t be leaving
this stuff up to AI to decide

what’s appropriate or not,

because we know what happens.

It’ll start censoring other things.

It’ll start censoring queer content.

It’ll start censoring
legitimate public speech.

What’s allowed in these discourses,

it shouldn’t be something
that’s left up to unaccountable systems.

It’s part of a discussion
all of us should be having.

But I’d leave a reminder

that the alternative isn’t
very pleasant, either.

YouTube also announced recently

that they’re going to release
a version of their kids' app

that would be entirely
moderated by humans.

Facebook – Zuckerberg said
much the same thing at Congress,

when pressed about how they
were going to moderate their stuff.

He said they’d have humans doing it.

And what that really means is,

instead of having toddlers being
the first person to see this stuff,

you’re going to have underpaid,
precarious contract workers

without proper mental health support

being damaged by it as well.

(Laughter)

And I think we can all do
quite a lot better than that.

(Applause)

The thought, I think, that brings those
two things together, really, for me,

is agency.

It’s like, how much do we really
understand – by agency, I mean:

how we know how to act
in our own best interests.

Which – it’s almost impossible to do

in these systems that we don’t
really fully understand.

Inequality of power
always leads to violence.

And we can see inside these systems

that inequality of understanding
does the same thing.

If there’s one thing that we can do
to start to improve these systems,

it’s to make them more legible
to the people who use them,

so that all of us have
a common understanding

of what’s actually going on here.

The thing, though, I think
most about these systems

is that this isn’t, as I hope
I’ve explained, really about YouTube.

It’s about everything.

These issues of accountability and agency,

of opacity and complexity,

of the violence and exploitation
that inherently results

from the concentration
of power in a few hands –

these are much, much larger issues.

And they’re issues not just of YouTube
and not just of technology in general,

and they’re not even new.

They’ve been with us for ages.

But we finally built this system,
this global system, the internet,

that’s actually showing them to us
in this extraordinary way,

making them undeniable.

Technology has this extraordinary capacity

to both instantiate and continue

all of our most extraordinary,
often hidden desires and biases

and encoding them into the world,

but it also writes them down
so that we can see them,

so that we can’t pretend
they don’t exist anymore.

We need to stop thinking about technology
as a solution to all of our problems,

but think of it as a guide
to what those problems actually are,

so we can start thinking
about them properly

and start to address them.

Thank you very much.

(Applause)

Thank you.

(Applause)

Helen Walters: James, thank you
for coming and giving us that talk.

So it’s interesting:

when you think about the films where
the robotic overlords take over,

it’s all a bit more glamorous
than what you’re describing.

But I wonder – in those films,
you have the resistance mounting.

Is there a resistance mounting
towards this stuff?

Do you see any positive signs,
green shoots of resistance?

James Bridle: I don’t know
about direct resistance,

because I think this stuff
is super long-term.

I think it’s baked into culture
in really deep ways.

A friend of mine,
Eleanor Saitta, always says

that any technological problems
of sufficient scale and scope

are political problems first of all.

So all of these things we’re working
to address within this

are not going to be addressed
just by building the technology better,

but actually by changing the society
that’s producing these technologies.

So no, right now, I think we’ve got
a hell of a long way to go.

But as I said, I think by unpacking them,

by explaining them, by talking
about them super honestly,

we can actually start
to at least begin that process.

HW: And so when you talk about
legibility and digital literacy,

I find it difficult to imagine

that we need to place the burden
of digital literacy on users themselves.

But whose responsibility
is education in this new world?

JB: Again, I think this responsibility
is kind of up to all of us,

that everything we do,
everything we build, everything we make,

needs to be made
in a consensual discussion

with everyone who’s avoiding it;

that we’re not building systems
intended to trick and surprise people

into doing the right thing,

but that they’re actually involved
in every step in educating them,

because each of these systems
is educational.

That’s what I’m hopeful about,
about even this really grim stuff,

that if you can take it
and look at it properly,

it’s actually in itself
a piece of education

that allows you to start seeing
how complex systems come together and work

and maybe be able to apply
that knowledge elsewhere in the world.

HW: James, it’s such
an important discussion,

and I know many people here
are really open and prepared to have it,

so thanks for starting off our morning.

JB: Thanks very much. Cheers.

(Applause)
