The coming age of empathic computing

Thank you, it's so great to be here today. I'm really honoured to be able to talk to you all. To start off, I want to tell a story, and I want you to imagine one of the worst days you've ever had at work.

This story started a few years ago in South Australia. A young apprentice who was new on the job was sent out to repair a generator at a power station, like this one here. On site, he made a mistake: he sent what's called a trip signal down the line. A trip signal goes from the generator to the main control centre and warns that the generator is about to fail, so that the generator automatically takes itself offline and doesn't get damaged. In this case the generator was perfectly fine, but it took itself offline anyway, and a thousand homes lost power.

He tried to fix the problem. He called up his boss and tried to talk it through on the phone, but they couldn't really work out the problem, so somebody more experienced had to be sent out to help him. It was a 45-minute drive from the main office to where he was working, and when the more experienced person arrived, the expert fixed it in a few minutes, just pushing a few buttons and flipping a few switches. So a thousand homes lost power for an hour.

Now, you might think that's not a big deal, but it turns out that in South Australia, when homes lose power, the government fines the power company, and that cost them a quarter of a million dollars.

However, it could have been very different. Imagine if there had been technology that allowed the young worker on site to share his view of the generator with the remote expert, and the remote expert could have talked him through the fix without having to drive all the way out there. So today I'm going to talk about new technology that allows you to share your view with somebody else.

Have you ever heard the phrases "sometimes I just wish I could get inside your head" or "I just wish I could see things from your perspective"? What I want to talk about today is technology that makes this happen, and how it could change how we collaborate and communicate with other people.

Communication, of course, is vital. John Pierce said that communication is not only the essence of being human, but also a vital property of life. Over thousands of years we have developed new ways to communicate, all the way from cave paintings to the telephone to the modern social networks we have today.

However, when you're talking to somebody on the phone, or looking at them on a Zoom call, or even seeing them as a character in a virtual reality environment, you're always looking at them face to face, and it's really hard to see from somebody else's perspective when you're looking them in the face. In fact, one of the goals of video conferencing is to make people feel like they're in the same space; it's sometimes said that the aim is to make people feel like they're "being there" together. But wouldn't it be better to go beyond being there, and look at technology that can do much more than we can with video conferencing?

That's what we've been developing in our lab at the University of Auckland.

This is one of the earliest projects we did. You can see a person with a headset on his head, with a small camera on top; with this camera he can share his view with a remote person. In this video, on the right-hand side, the remote user is watching that video and drawing on it with his mouse. On the left-hand side you can see the view through the headset: we're using augmented reality to overlay the remote person's annotations onto the real world.

Now imagine if our young apprentice had had this technology in the field. He could have saved a quarter of a million dollars by having the remote expert point at exactly what he needed to do to fix the generator. And of course this technology could be used in many different ways: you could imagine a surgeon using it to share a view of an operating theatre with a remote expert, or even a granddaughter going to a museum and showing her grandmother the amazing pictures she's seeing.

However, one limitation of that system is that it only shares what is directly in front of the person. So more recently we developed this system, where you wear a 360° camera on your head and live-stream its view to a remote person in a virtual reality headset. The remote person can now look wherever they want and see your whole surroundings. Not only that: on the remote person's side we have technology to capture their hand gestures and send them back, so the person on the left sees ghost hands floating in front of them, showing them how to do a task.

You can see a video of this working here, again with the power station example. Here's a person in a power station control room, looking through a pair of augmented reality glasses with our system on his head. Inside his glasses he sees a ghost hand appearing in front of him, and with the ghost hand the remote expert can also draw on the real world, with those drawings appearing in front of you.

The green square there shows where the remote person is looking, so both people know whether or not they're looking in the same direction. And as I said before, the video from the local person's head gets streamed to the second user, the expert in virtual reality, and that person now feels like they're standing in the same body as the person sending the video: wherever they look, they see that person's view.

With this system, again, our young apprentice could easily have solved the problem without the power going out for a whole hour. However, for the remote person it still feels like they're standing in a video of the real world; it doesn't really feel like they're in the real world.

So most recently we've developed this system, where we chain together a number of 3D cameras. Each camera creates a point cloud of part of the real world, and we stitch them all together in real time to create the view on the right-hand side, which is a 3D model of the real world.
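The stitching step can be sketched in a few lines. This is a minimal illustration under assumptions, not the lab's actual pipeline: it presumes each 3D camera's pose (a 4×4 camera-to-world transform) is already known from calibration, and simply moves every camera's points into a shared world frame before concatenating them.

```python
import numpy as np

def stitch_point_clouds(clouds, poses):
    """Merge per-camera point clouds into one world-frame cloud.

    clouds: list of (N_i, 3) arrays of XYZ points in each camera's frame.
    poses:  list of 4x4 camera-to-world transforms from calibration.
    """
    merged = []
    for points, T in zip(clouds, poses):
        # Rigid transform: rotate, then translate, into the world frame.
        world = points @ T[:3, :3].T + T[:3, 3]
        merged.append(world)
    return np.vstack(merged)

# Example: two cameras looking at the same room from different positions.
cam_a = np.array([[0.0, 0.0, 1.0]])   # one point, 1 m in front of camera A
pose_a = np.eye(4)                    # camera A sits at the world origin
pose_b = np.eye(4)
pose_b[:3, 3] = [2.0, 0.0, 0.0]       # camera B is 2 m to the right
cloud = stitch_point_clouds([cam_a, np.zeros((1, 3))], [pose_a, pose_b])
# cloud rows: camera A's point at [0, 0, 1], camera B's origin at [2, 0, 0]
```

A real system would also have to do this at frame rate, filter overlapping points, and cope with the gaps the cameras can't see.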

This video shows it working: the left-hand side is a Godzilla (bird's-eye) perspective, and the right-hand side is the first-person perspective. You can see a live view of one of our workspaces, and in a minute one of my students is going to walk into the space; you can see them walking in now as part of this 3D model. Of course there are some limitations: the blue areas are where the cameras can't see, so there are some empty gaps. But you can see how this is a step beyond sharing a 360° video, towards sharing a whole 3D environment.

So in a few years we've gone from sharing 2D video, to 360° video, to full 3D. That has dramatically increased how immersed you feel in the space, builds better scene understanding, and lets you collaborate better with the other person.

So you could imagine that in a few years' time we'd have a small handheld device: I could hold it up like this and live-stream my 3D surroundings to anybody else, anywhere in the world.

For example, a mountain biker competing in the Olympics could wear a device like this on their helmet and live-stream their view to people all over the world, who could sit on their couches and feel what it's like to be the mountain biker riding down the mountain.

However, seeing what someone else is seeing isn't enough; you also want to know what they're feeling. So we've also done some experiments where we add sensors. In this case we have a person in a VR headset wearing a special glove that measures their heart rate, and we developed a system where they could share their view with another person. On the right-hand side you can see one person inside virtual reality playing a game while we share their heart rate with another person, and it turns out that when the player gets excited, the other person starts feeling more excited as well.
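As a toy illustration of the idea, and not the actual study system, sharing a physiological cue can be as simple as streaming heart-rate samples and flagging the moments they rise well above the wearer's recent baseline, so the viewer's side can react. The function name and thresholds here are illustrative assumptions.

```python
from statistics import mean, stdev

def excitement_flags(samples, window=5, z=2.0):
    """Flag heart-rate samples that jump above a rolling baseline.

    samples: heart-rate readings in beats per minute, oldest first.
    Returns one bool per sample after the warm-up window.
    """
    flags = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Flag when the reading sits more than z deviations above baseline.
        flags.append(samples[i] > mu + z * max(sigma, 1.0))
    return flags

# Resting around 65 bpm, then a sudden spike while playing the game.
hr = [64, 66, 65, 63, 66, 65, 64, 95]
print(excitement_flags(hr))  # only the final 95 bpm spike is flagged
```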

We can even fake it: if we artificially enhance the person's heart rate, the other person still feels more excited. So we can share the feelings of one person with another.

More recently we developed this system, where we've added an EEG cap so we can measure your brain activity, along with special sensors on the faceplate of the VR display. We can now measure your brain activity while you're inside virtual reality, use machine learning to recognize the emotions you're feeling, and have the virtual environment respond to those emotions.
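To give a flavour of how the machine-learning step can work, here is a heavily simplified sketch, not our actual model: it assumes the classic EEG band-power features (theta, alpha, beta) and a nearest-centroid classifier trained on clean synthetic epochs. Real EEG emotion recognition deals with far noisier data, many channels, and proper validation.

```python
import numpy as np

FS = 128  # sampling rate in Hz

def band_power(signal, low, high):
    """Average spectral power of one EEG channel within a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def features(epoch):
    # Classic EEG bands: theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz).
    return np.array([band_power(epoch, 4, 8),
                     band_power(epoch, 8, 13),
                     band_power(epoch, 13, 30)])

# Synthetic stand-in data: "calm" epochs carry strong alpha (10 Hz) activity,
# "excited" epochs strong beta (20 Hz). Real EEG is far messier than this.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
calm = [np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS) for _ in range(20)]
excited = [np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(FS) for _ in range(20)]

# Nearest-centroid "training": average feature vector per emotion class.
centroids = {label: np.mean([features(e) for e in epochs], axis=0)
             for label, epochs in [("calm", calm), ("excited", excited)]}

def classify(epoch):
    f = features(epoch)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

print(classify(np.sin(2 * np.pi * 21 * t)))  # a beta-dominated epoch
```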

One of the most exciting things about brain activity is a phenomenon discovered about ten years ago called brain synchronization. It turns out that if you have two people in the real world doing the same task and you measure their brain activity, sometimes their brains start to synchronize: the phase of their brain waves aligns with the other person's. When this happens they enter what's called a flow state, and people feel like they're working together more efficiently and communicating better. You may have heard the phrase "I feel just like I'm in sync with somebody else"; well, it turns out that sometimes your brain really is.

This has been shown in a number of real-world activities. In this example, people do a finger-tracking exercise: you put your finger out, the other person puts theirs out, and you track each other around in space. And this is what happens: here are the two brains of the two people, with the black dots showing the EEG electrodes. This is before they start the activity; they do finger tracking for a while, then stop, and this is their brain activity afterwards. The little arcs show pairs of electrodes that are connected together and in phase. These people now feel more synchronized.
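Phase synchrony of this kind is commonly quantified with the phase-locking value (PLV): extract each signal's instantaneous phase with the Hilbert transform, then measure how constant the phase difference stays (1 means perfectly locked, near 0 means unrelated). A minimal sketch, illustrative rather than the study's actual analysis code:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equal-length signals, in [0, 1]."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    # Mean resultant length of the phase-difference distribution.
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

fs = 256
t = np.arange(10 * fs) / fs
rng = np.random.default_rng(1)

# Two 10 Hz "alpha" signals with a fixed phase lag: strongly locked.
locked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                             np.sin(2 * np.pi * 10 * t + 0.8))
# Two independent noise signals: little to no locking.
unlocked = phase_locking_value(rng.standard_normal(t.size),
                               rng.standard_normal(t.size))
print(locked, unlocked)
```

In an inter-brain study this would be computed per electrode pair and per frequency band, which is what the arcs between electrodes in the figure summarize.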

This has been shown to happen in the real world, but until now nobody had done it inside virtual reality, so we did the same thing in VR. Here are two people sitting inside virtual reality. Inside the VR they see themselves like this, so they can see the other person, point at them, and do that finger-tracking activity.

Now, of course, the thing about virtual reality is that you can do things you can't do in the real world. For example, you can put yourself in somebody else's body. This is that view: now when I look down I can see two pairs of hands coming out of my body, and I can do the same finger-tracking activity, but with somebody else's hands. And it turns out that when you do this, you get even more brain synchronization.

Here are the two brains; it's a bit of a busy chart, but you can see the lines going between the electrodes. This is before they start the activity: the width and colour of each line shows how many connections there are, and there are only a few connections between the two brains. But when we finish the activity, when I've been seeing from the other person's perspective, you can see these connections here, and the big red lines show that there is now a very strong connection between the two brains.

So far we've shown how you can use augmented and virtual reality to create this brain synchronization and brain connection, and to look at the world from other people's perspective. Of course, advances in technology don't stop, and over the next few years you'll see even more trends like this. There are a couple of very important ones. The first is the trend towards experience capture.

We'll go from being able to share faces to being able to share places: a decade ago I could have had a video conference with somebody else, and now I can live-stream my view with 360° video.

A second important trend is faster networks, and faster networks mean more bandwidth for better communication. Twenty years ago I was using a dial-up modem; now my apartment here in Auckland has a gigabit fibre connection, so I can stream high-definition video of myself to the world.

And the third trend is towards implicit understanding, which basically means systems that recognize our behaviour and emotion. For example, I can push a button on my phone and talk to Siri, and Siri understands what I'm saying; or a camera can understand what I'm doing, or the expressions on my face.

These three trends together, natural collaboration, experience capture, and implicit understanding, all overlap in an area we call empathic computing. The goal of empathic computing is basically to develop systems that allow us to share what we are seeing, hearing, and feeling with others. Overall, this means we now have a trend towards what you might call empathic tele-existence: new display technology, the capture of spaces, and emotion sensing all blend together to create a new type of collaboration.

Empathic tele-existence means that you move from being an observer of somebody else to being a participant with them in the same space. You can see the world from their perspective; communication can change from explicit to implicit, as the system recognizes gestures and other non-verbal cues; and most importantly, you feel like you're doing things together, rather than watching somebody else do something.

So in the coming years we'll have technology that will allow us to know what other people are seeing, hearing, and feeling. For the first time, we'll be able to truly see the world from somebody else's perspective, and know what it's like to get inside their head. Empathic computing will really change the way we work and play with people forever.
