A glimpse of the future through an augmented reality headset
Meron Gribetz

Today’s computers are so amazing

that we fail to notice
how terrible they really are.

I’d like to talk to you today
about this problem,

and how we can fix it with neuroscience.

First, I’d like to take you back
to a frosty night in Harlem in 2011

that had a profound impact on me.

I was sitting in a dive bar
outside of Columbia University,

where I studied computer science
and neuroscience,

and I was having this great conversation
with a fellow student

about the power of holograms
to one day replace computers.

And just as we were getting
to the best part of the conversation,

of course, his phone lights up.

And he pulls it towards himself,
and he looks down and he starts typing.

And then he forces his eyeballs
back up to mine and he goes,

“Keep going. I’m with you.”

But of course his eyes were glazed over,

and the moment was dead.

Meanwhile, across the bar,

I noticed another student
holding his phone,

this time towards a group.

He was swiping through
pictures on Instagram,

and these kids were laughing hysterically.

And that dichotomy
between how crappy I was feeling

and how happy they were feeling
about the same technology,

really got me thinking.

And the more I thought of it,
the more I realized

it was clearly not the digital information
that was the bad guy here,

it was simply the display position
that was separating me from my friend

and that was binding those kids together.

See, they were connected around something,

just like our ancestors
who evolved their social cognitions

telling stories around the campfire.

And that’s exactly what tools
should do, I think.

They should extend our bodies.

And I think computers today
are doing quite the opposite.

Whether you’re sending
an email to your wife

or you’re composing a symphony

or just consoling a friend,

you’re doing it in pretty
much the same way.

You’re hunched over these rectangles,

fumbling with buttons and menus
and more rectangles.

And I think this is the wrong way,

I think we can start using
a much more natural machine.

We should use machines that bring
our work back into the world.

We should use machines that use
the principles of neuroscience

to extend our senses
versus going against them.

Now it just so happens
that I have such a machine here.

It’s called the Meta 2.

Let’s try it out.

Now in front of me right now,
I can see the audience,

and I can see my very hands.

And in three, two, one,

we’re going to see an immersive
hologram appear,

a very realistic hologram
appear in front of me,

of our very glasses I’m wearing
on my head right now.

And of course this could be
anything that we’re shopping for

or learning from,

and I can use my hands

to very nicely kind of move
it around with fine control.

And I think Iron Man would be proud.

We’re going to come back
to this in just a bit.

(Applause)

Now if you’re anything like me,
your mind is already reeling

with the possibilities of what we can do
with this kind of technology,

so let’s look at a few.

My mom is an architect,

so naturally the first thing I imagined

was laying out a building in 3D space

instead of having to use
these 2D floor plans.

She’s actually touching graphics right now

and selecting an interior decor.

This was all shot through a GoPro
through our very glasses.

And this next use case
is very personal to me,

it’s Professor Adam Gazzaley’s
glass brain project,

courtesy of UCSF.

As a neuroscience student,

I would always fantasize

about the ability to learn and memorize
these complex brain structures

with an actual machine,

where I could touch and play
with the various brain structures.

Now what you’re seeing
is called augmented reality,

but to me, it’s part
of a much more important story –

a story of how we can begin
to extend our bodies with digital devices,

instead of the other way around.

Now …

in the next few years, humanity’s
going to go through a shift, I think.

We’re going to start putting
an entire layer of digital information

on the real world.

Just imagine for a moment

what this could mean for storytellers,

for painters,

for brain surgeons,

for interior decorators

and maybe for all of us here today.

And what I think we need
to do as a community,

is really try and make an effort

to imagine how we can
create this new reality

in a way that extends
the human experience,

instead of gamifying our reality

or cluttering it with digital information.

And that’s what I’m very passionate about.

Now, I want to tell you a little secret.

In about five years –

this is not the smallest device –

in about five years,

these are all going to look like
strips of glass on our eyes

that project holograms.

And just like we don’t care so much
about which phone we buy

in terms of the hardware – we buy it
for the operating system –

as a neuroscientist,

I always dreamt of building
the iOS of the mind, if you will.

And it’s very, very important
that we get this right,

because we might be living
inside of these things

for at least as long as we’ve lived

with the Windows graphical user interface.

And I don’t know about you,

but living inside of Windows scares me.

(Laughter)

To isolate the single most intuitive
interface out of infinity,

we use neuroscience to drive
our design guidelines,

instead of letting a bunch of designers
fight it out in the boardroom.

And the principle we all revolve around

is what’s called the “Neural Path
of Least Resistance.”

At every turn, we’re connecting
the iOS of the mind with our brain,

for the first time, on our brain’s terms.

In other words, we’re trying to create
a zero learning-curve computer.

We’re building a system
that you’ve always known how to use.

Here are the first three
design guidelines that we employ

in this brand-new form of user experience.

First and foremost,
you are the operating system.

Traditional file systems
are complex and abstract,

and they take your brain
extra steps to decode them.

We’re going against the Neural Path
of Least Resistance.

Meanwhile, in augmented reality,

you can of course place
your holographic TED panel over here,

and your holographic email
on the other side of the desk,

and your spatial memory evolved just fine
to go ahead and retrieve them.

You could put your holographic Tesla
that you’re shopping for –

or whatever model my legal team
told me to put in right before the show.

(Laughter)

Perfect. And your brain knows
exactly how to get it back.

The second interface guideline
we call “touch to see.”

What do babies do when they see
something that grabs their interest?

They try and reach out and touch it.

And that’s exactly how the natural
machine should work as well.

Turns out the visual system
gets a fundamental boost

from a sense we call proprioception –

that’s the sense
of our body parts in space.

So by touching our work directly,
we’re not only going to control it better,

we’re also going to understand
it much more deeply.

Hence, touch to see.

But it’s not enough
to experience things ourselves.

We’re inherently these social primates.

And this leads me to our third guideline,

the holographic campfire
from our first story.

Our mirror-neuron subsystem suggests

that we can connect with each other
and with our work much better

if we can see each other’s
faces and hands in 3D.

So if you look at the video behind me,

you can see two Meta users
playing around with the same hologram,

making eye contact,
connected around this thing,

instead of being distracted
by external devices.

Let’s go ahead and try this again
with neuroscience in mind.

So again, our favorite interface,
the iOS of the mind.

I’m going to now take a step further

and go ahead and grab this pair of glasses

and leave it right here by the desk.

I’m now with you, I’m in the moment,

we’re connecting.

My spatial memory kicks in,
and I can go ahead and grab it

and bring it right back here, reminding me

that I am the operating system.

And now my proprioception is working,

and I can go ahead and explode
these glasses into a thousand parts

and touch the very sensor
that is currently scanning my hand.

But it’s not enough to see things alone,

so in a second, my co-founder Ray
is going to make a 3D call –

Ray?

(Ringing)

Hey Ray, how’s it going?

Guys, I can see this guy
in front of me in full 3D.

And he is photo-realistic.

(Applause)

Thank you.

My mirror-neuron subsystem suggests
that this is going to replace phones

in not too long.

Ray, how’s it going?

Ray: Great. We’re live today.

(Applause)

MG: Ray, give the crowd a gift

of the holographic brain
we saw from the video earlier.

Guys, this is not only
going to change phones,

it’s also going to change
the way we collaborate.

Thank you so much.

Thanks, Ray.

Ray: You’re welcome.

(Applause)

MG: So folks, this is the message
that I discovered in that bar in 2011:

The future of computers is not
locked inside one of these screens.

It’s right here, inside of us.

(Applause)

So if there’s one idea that I could
leave you with here today,

it’s that the natural machine
is not some figment of the future,

it’s right here in 2016.

Which is why all hundred of us at Meta,

including the administrative staff,

the executives,

the designers, the engineers –

before TED2017,

we’re all going to be throwing
away our external monitors

and replacing them with a truly
and profoundly more natural machine.

Thank you very much.

(Applause)

Thank you, appreciate it.

Thanks, guys.

Chris Anderson: So help
me out on one thing,

because there’ve been a few
augmented reality demos

shown over the last year or so out there.

And there’s sometimes
a debate among technologists

about, are we really seeing
the real thing on-screen?

There’s this issue of field of view,

that somehow the technology
is showing a broader view

than you would actually see
wearing the glasses.

Were we seeing the real deal there?

MG: Absolutely the real deal.

Not only that,

we took extra measures to shoot it
with a GoPro through the actual lens

in the various videos
that you’ve seen here.

We want to try to simulate
the experience for the world

that we’re actually seeing
through the glasses,

and not cut any corners.

CA: Thank you so much for showing us that.

MG: Thanks so much, I appreciate that.
