Nadya Bartol: Better cybersecurity starts with honesty and accountability (TED)

Today, I’m going to talk about
a shameful topic.

This has happened to many of us,
and it’s embarrassing,

but if we don’t talk about it,
nothing will ever change.

It’s about being hacked.

Some of us have clicked on a phishing link
and downloaded a computer virus.

Some of us have had our identities stolen.

And those of us
who are software developers

might have written insecure code
with security bugs in it

without realizing it.

As a cybersecurity expert,

I have worked with countless companies
on improving their cybersecurity.

Cybersecurity experts like me
have advised companies

on good cybersecurity practices,

monitoring tools

and proper user behaviors.

But I actually see a much bigger problem
that no tool can fix:

the shame associated with
the mistakes that we make.

We like to think of ourselves
as competent and tech savvy,

but when we make these mistakes,
they can have a really bad impact

on us and our companies –

anything from a simple annoyance,

to taking a lot of time to fix,

to costing us and our employers
a lot of money.

Despite billions of dollars
that companies spend on cybersecurity,

practitioners like me see the same
problems over and over again.

Let me give you some examples.

The 2015 hack of Ukrainian utilities

that disconnected power
for 225,000 customers

and took months to restore
to full operations

started with a phishing link.

By the way, 225,000 customers
is a lot more than 225,000 people.

Customers can be anything
from an apartment building

to an industrial facility

to a shopping mall.

The 2017 data breach of Equifax

that exposed personally
identifiable information

of 140 million people

and may ultimately cost Equifax something
on the order of 1.4 billion dollars:

that was caused by the exploitation
of a well-known vulnerability

in the company’s consumer
complaint portal.

Fundamentally, this is about
technology and innovation.

Innovation is good;
it makes our lives better.

Most of the modern cars we drive today
are fundamentally computers on wheels.

They tell us where to go to avoid traffic,
when to take them in for maintenance

and then give us all kinds of
modern-day conveniences.

Many people use connected
medical devices like pacemakers

and glucose monitors with insulin pumps.

These devices make
these people’s lives better

and sometimes even extend their lives.

But anything that can be interconnected
can be hacked when it’s connected.

Did you know that the former
US Vice President Dick Cheney

kept his pacemaker disconnected from Wi-Fi
before he received a heart transplant?

I will let you figure out why.

In a digitally interconnected world,
cyber risks are literally everywhere.

For years, my colleagues and I
have been talking about

this elusive notion
of cybersecurity culture.

Cybersecurity culture is when
everybody in the organization

believes that cybersecurity is their job,

knows what to do and what not to do

and does the right thing.

Unfortunately, I can’t tell you
which companies do this well,

because by doing so, I would put
a juicy target on their backs

for ambitious attackers.

But what I can do is make
cybersecurity less mysterious,

bring it out into the open
and talk about it.

There should be no mystery or secrecy
within an organization.

When something is invisible
and it’s working,

we don’t know that it’s there
until it’s not there.

Kind of like toilet paper.

When the COVID-19 pandemic began,

what had always been there suddenly
became super important

because we couldn’t find it anywhere.

Cybersecurity is just like that:

when it’s working, we don’t notice,
and we don’t care.

But when it’s not working,

it can be really, really bad.

Toilet paper is pretty straightforward.

Cybersecurity is mysterious and complex.

And I actually think it starts
with the notion of psychological safety.

This notion was popularized
by an organizational behavior scientist,

Amy Edmondson.

Amy studied the behavior of medical teams
in high-stakes settings like hospitals,

where mistakes could be fatal.

And she found that
nurses were not comfortable

bringing up suggestions to the doctors

for fear
of questioning authority.

Amy helped medical teams improve

so that nurses felt comfortable
bringing up suggestions to the doctors

for patient treatment

without the fear of being
scolded or demeaned.

For that to happen, doctors needed
to listen and be receptive –

without judging.

Psychological safety is when everybody
is comfortable speaking up

and pointing things out.

I want cybersecurity to be the same.

And I want cybersecurity practitioners
to be comfortable bringing suggestions up

to senior executives
or software developers,

without being dismissed as those people
who continue to talk about

horrors and errors,

and say no.

Not dismissing them is really hard

for the individuals who are responsible
for the creation of digital products

because fundamentally, it’s about
their pride and joy in their creations.

I once tried talking to a senior
software development executive

about the need to do better security.

You know what he said?

“Are you telling me
we’re developing insecure code?”

In other words, what he heard
was, “Your baby is ugly.”

What if instead of focusing on
what not to do,

we focused on what to do?

Like, how do we develop better software

and protect our customer
information at the same time?

Or how do we make sure
that our organization is able to operate

in crisis, under attack
or in an emergency?

And what if we rewarded the good things
that people do in cybersecurity

and encouraged them to keep doing them,

like reporting security incidents,

reporting potential phishing emails,

or finding and fixing
software security bugs

in the software that they develop?

And what if we tied these good security
actions to performance evaluations

to make it really matter?

I would love for us to communicate
these good cybersecurity things

and encourage them in some sort of
company-wide communications

like newsletters, blogs,
websites, microsites –

whatever we use to communicate
to our organization.

What if a company announced a competition
for who finds and fixes

the most security bugs in a two-week
development sprint,

then announced the quarter’s winner

at a large, company-wide virtual town hall,

and rewarded those winners
with something meaningful,

like a week’s vacation or a bonus?

Others will see
the celebration and recognition,

and they’ll want to do the same.

In the energy industry,

there is a really strong
culture of safety.

People care about this culture,
are proud of it,

and there is a collective
reinforcement of this culture

to make sure that nobody gets hurt.

One of the ways they exhibit and keep
this safety conscious culture going

is by counting and visibly displaying days
since the last safety incident.

And then everybody works really hard
not to have that count go back to zero

because that means
that somebody did get hurt.

Cybersecurity is the same as safety.

What if we all agree

to keep that count of days
since the last cybersecurity incident

going on forever

and then work really hard
not to have it reset to zero?

And then certain things are a no-no,

and we need to clearly communicate
to our organizations what they are

in an easily digestible
and maybe even fun way,

like gamification or simulations,

to make sure that people
can remember this.

And if somebody does something
they’re not supposed to do,

they should face some sort
of consequences.

So, for example, if an employee buys
equipment on Amazon or eBay

or uses personal Dropbox
for their company business,

then they should face
some sort of consequences.

And when this happens, executives
should get the same treatment

as regular employees,

because if they don’t,
then people won’t believe that it’s real

and will go back to their old behaviors.

It’s OK to talk about mistakes,

but just like when a teenager
who breaks the rules tells us about it,

we appreciate that they told us,

but there should still be
some sort of consequences.

Cybersecurity is a journey.

It’s not a destination,

and we need to keep working on it.

I would love for us to celebrate
cybersecurity people

like the heroes that they are.

If we think about it,
they are firefighters,

emergency room doctors and nurses,

law enforcement, risk executives
and business strategists

all in the same persona.

And they help us protect the modern life
that we like so much.

They protect our identities,
our inventions, our intellectual property,

our electric grid, medical devices,

connected cars and myriad other things.

And I’d like to be on that team.

So let’s agree that this thing
is here to stay,

let’s create a safe environment
to learn from our mistakes,

and let’s commit to making things better.

Thank you.
