The coming age of empathic computing

Thank you, it's so great to be here today. I'm really honored to be able to talk to you all.

So to start off with, I want to tell a story, and I want you to imagine one of the worst days you've ever had at work.

This story started a few years ago in South Australia. There was a young apprentice who was new on the job, and his job was to go out and repair a generator at a power station, like this one here.

So he went on site, and while he was there he made a mistake: he sent what's called a trip signal down the line. A trip signal goes from the generator to the main control center and lets them know that the generator is about to fail, so the generator automatically takes itself offline so it doesn't get damaged. In this case the generator was perfectly fine, but it took itself offline, and a thousand homes lost power.

He tried to fix the problem. He called up his boss and tried to talk it through on the phone, but the boss couldn't really understand the problem, so they had to send somebody more experienced out to help him. It was a 45-minute drive from the main office to where he was working, and when the more experienced person got there, the expert fixed it in a few minutes: he just pushed a few buttons and flipped a few switches. So those thousand homes lost power for an hour.

Now, you might think that's not a big deal, but it turns out that in South Australia, when homes lose power, the government fines the power company, and in this case that cost them a quarter of a million dollars.

However, it could have been very different. Imagine if there had been technology that allowed the young worker on site to share his view of the generator with the remote expert, and the remote expert could have talked him through the fix without having to drive all the way out there. So today I'm going to talk about new technology that allows you to share your view with somebody else.

Have you ever heard the phrase "sometimes I just wish I could get inside your head", or "I just wish I could see things from your perspective"? What I want to talk about today is technology that makes this possible, and how it could change how we collaborate and communicate with other people.

Communication, of course, is vital. John Pierce said that communication is not only the essence of being human but also a vital property of life, and over thousands of years we have developed new ways to communicate, all the way from cave paintings to the telephone to the modern social networks we have today.

However, when you're talking to somebody on the phone, looking at them on a Zoom call, or even seeing them as a character in a virtual reality environment, you're always looking at them face to face, and it's really hard to see things from somebody else's perspective when you're looking them in the face.

In fact, one of the goals of video conferencing is to make people feel like they're in the same space; it's sometimes said that the aim is to make people feel like they're "being there" together. But wouldn't it be better to go beyond being there, and to look at technology that can do much more than video conferencing can?

And that's what we've been developing in our lab here at the University of Auckland.

This is one of the earliest projects we did. In this case you can see a person with a headset on his head and a small camera on top, and with this camera he can share his view with a remote person. In this video, on the right-hand side, the remote user is watching that video and drawing on it with his mouse; on the left-hand side you can see the view through the headset, where we use augmented reality to overlay the remote person's annotations onto the real world.
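To give a sense of how this kind of view sharing plus remote annotation fits together, here is a minimal sketch in Python. It is not our actual system: it assumes a hypothetical UDP channel on port 9001 carrying the remote expert's mouse strokes as normalized (x, y) points, and it simply draws them over the worker's live camera feed with OpenCV, screen-stabilized rather than anchored in the world the way the AR overlay in the video is.

```python
import json
import socket

import cv2

# Hypothetical UDP channel carrying the remote expert's strokes as
# JSON lists of normalized (x, y) points in [0, 1].
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9001))
sock.setblocking(False)

cap = cv2.VideoCapture(0)      # stand-in for the head-mounted camera
strokes = []                   # accumulated annotation points

while True:
    ok, frame = cap.read()
    if not ok:
        break
    try:
        while True:            # drain any strokes that arrived since the last frame
            data, _ = sock.recvfrom(65536)
            strokes.extend(json.loads(data))
    except BlockingIOError:
        pass
    h, w = frame.shape[:2]
    for x, y in strokes:       # draw the remote annotations over the live view
        cv2.circle(frame, (int(x * w), int(y * h)), 4, (0, 0, 255), -1)
    cv2.imshow("shared view", frame)
    if cv2.waitKey(1) == 27:   # Esc quits
        break
```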

Now imagine if our young apprentice had had this technology in the field. He could have saved a quarter of a million dollars by having the remote expert point at exactly what he needed to do to fix the problem. And of course this technology could be used in many different ways: you could imagine a surgeon using it to share a view of an operating theatre with a remote expert, or even a granddaughter going to a museum and showing her grandmother the amazing pictures she's seeing.

However, one limitation of this system is that it only shares what is right in front of the person.

So more recently we developed this system, where you can wear a 360 camera on your head and live stream the 360 view to a remote person in a virtual reality headset. This means the remote person can now look wherever they want and see your surroundings. Not only that, on the remote person's side we have technology to capture their hand gestures and send them back, so the person on the left sees these kinds of ghost hands floating in front of them, showing them how to do a task. You can see a video of that working here.

This is again the power station example. Here's a person in a power station control room, looking through a pair of augmented reality glasses with our system on his head. Inside his glasses he sees this ghost hand appearing in front of him, and with the ghost hand the remote expert can also draw on the real world, with those drawings appearing right in front of him. The green square shows where the remote person is looking, so both people know whether or not they're looking in the same direction. And as I said before, the video from the person's head gets streamed to the second user, the expert in virtual reality, who now feels like he's standing in the same body as the person sending the video: wherever he looks, he sees that person's view.

So with this system, again, our young apprentice could easily have solved the problem without the power going out for a whole hour.

However, for the remote person it feels like they're standing inside a video of the real world; it doesn't really feel like they're in the real world.

So most recently we've developed this system, where we can chain together a number of 3D cameras. Each camera creates a point cloud of part of the real world, and we stitch them all together so that, in real time, we can create the view on the right-hand side, which is a 3D model of the real world.
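As a rough sketch of that stitching step, here is what fusing the per-camera point clouds might look like using the open-source Open3D library, assuming each camera's pose (its extrinsics) has already been calibrated. This is illustrative only, not the code from our system, and it leaves out the capture, synchronization, and streaming parts.

```python
import open3d as o3d

def fuse_point_clouds(rgbd_frames, intrinsics, extrinsics, voxel_size=0.01):
    """Fuse one RGB-D frame per camera into a single world-space point cloud.

    rgbd_frames : list of o3d.geometry.RGBDImage, one per 3D camera
    intrinsics  : matching list of o3d.camera.PinholeCameraIntrinsic
    extrinsics  : matching list of 4x4 camera-to-world matrices from calibration
    """
    merged = o3d.geometry.PointCloud()
    for rgbd, K, T in zip(rgbd_frames, intrinsics, extrinsics):
        pcd = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, K)
        pcd.transform(T)   # move the points from camera space into world space
        merged += pcd      # naive concatenation of the per-camera clouds
    # Down-sample so the merged cloud stays small enough to stream in real time.
    return merged.voxel_down_sample(voxel_size)
```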

This video shows that working: the left-hand side is a Godzilla perspective, and the right-hand side is the first-person perspective. You can see here a live view of one of our workspaces, and in a moment one of my students is going to walk into the space; you can see them walking in now as part of this 3D model. Of course there are some limitations: the blue areas are where the cameras can't see, so there are some empty gaps. But you can see how this is now making steps towards sharing not just a 360 video but a whole 3D environment.

So in a few years we've gone from sharing 2D video, to 360 video, to now full 3D, and that has dramatically increased how immersed you feel in the space, builds better scene understanding, and means you can collaborate better with the other person.

You could imagine that in a few years' time we'd have a small handheld device: I could hold it up like this and live stream my 3D surroundings to anybody else, anywhere in the world.

For example, you could be a mountain biker competing in the Olympics. You could wear a device like this on your helmet and live stream your view to people all over the world, and those people could be sitting on their couches, feeling what it's like to be the mountain biker riding down the mountain.

However, seeing what someone else is seeing isn't enough; you also want to know what they're feeling. So we've also done some experiments where you can add sensors. In this case we've got a person in a VR headset wearing a special glove that measures their heart rate, and we developed a system where they can share their view with another person. On the right-hand side you can see one person inside virtual reality playing a game, and we can share their heart rate with the other person.
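The sharing itself can be quite simple. As a minimal sketch (not our actual implementation), the glove's heart-rate reading could be sent alongside the video stream as a small message once a second, and the receiving side could use it to drive something visible such as a pulsing heart icon. The address, port, and read_heart_rate function below are hypothetical placeholders.

```python
import json
import socket
import time

PEER = ("192.0.2.10", 9000)    # hypothetical address of the viewer's machine

def read_heart_rate():
    """Placeholder for the glove's sensor driver; returns beats per minute."""
    return 72.0

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for _ in range(60):            # stream one reading per second for a minute
    msg = {"bpm": read_heart_rate(), "t": time.time()}
    sock.sendto(json.dumps(msg).encode(), PEER)
    time.sleep(1.0)
```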

It turns out that when the person playing the game gets excited, the other person starts feeling more excited as well. We can even fake it: if we artificially enhance the shared heart rate, the other person still feels more excited. So we can share the feelings of one person with another.

More recently we developed this system. In this case we've added an EEG cap so we can measure your brain activity, along with special sensors on the face plate of the VR display. We can now use this technology to measure your brain activity while you're inside virtual reality, use machine learning to recognize the emotions you're feeling, and have the virtual environment respond to those emotions.
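One plausible way such emotion recognition could work, sketched here under assumptions rather than as our exact pipeline, is to turn each short EEG epoch into band-power features and train an off-the-shelf classifier on epochs labelled with self-reported emotion.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}   # Hz

def bandpower_features(epoch, fs):
    """epoch: array of shape (n_channels, n_samples) -> per-channel band powers."""
    freqs, psd = welch(epoch, fs=fs, nperseg=int(fs * 2), axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)

# Hypothetical training data: EEG epochs paired with self-reported emotion labels.
# X = np.stack([bandpower_features(e, fs=256) for e in epochs])
# clf = RandomForestClassifier().fit(X, labels)
# emotion = clf.predict(X_new)   # the virtual environment can then respond to this
```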

One of the most exciting things about brain activity is a phenomenon discovered about 10 years ago called brain synchronization. It turns out that if you have two people in the real world doing the same task and you measure their brain activity, sometimes their brains start to synchronize: the phases of their brain waves begin to align with each other. When this happens, people enter what's called a flow state, and they feel like they're working together more efficiently and communicating better. You may have heard the phrase "I feel like I'm just in sync with somebody else"; well, it turns out that sometimes your brain really is.

This has been shown in a number of real-world activities. In this case, for example, people are doing a finger-tracking exercise: you put your finger out, the other person puts their finger out, and you track each other around in space. And what happens is this. These are the two brains of the two people, with the black dots showing the EEG electrodes. This is before they start the activity; they do the finger tracking for a while, then they stop, and this is their brain activity afterwards. You can see these little arcs here: they show pairs of electrodes that are connected together and in phase. So now these people feel like they're more synchronized.
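The synchronization behind those arcs is typically quantified with a measure such as the phase-locking value (PLV) between pairs of electrodes. Here is a minimal, illustrative Python version (our analysis pipeline may differ): band-pass each channel, take its instantaneous phase with the Hilbert transform, and measure how consistent the phase difference between the two channels stays over time.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_phase(x, fs, lo, hi, order=4):
    """Band-pass one EEG channel and return its instantaneous phase."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x)))

def plv(x, y, fs, lo=8.0, hi=13.0):
    """Phase-locking value between two channels in one band (alpha by default).

    Returns a value between 0 (no phase relationship) and 1 (perfectly locked).
    """
    dphi = band_phase(x, fs, lo, hi) - band_phase(y, fs, lo, hi)
    return np.abs(np.mean(np.exp(1j * dphi)))
```

Computing something like this for every pair of electrodes before and after the task is what produces the kind of before-and-after connection maps shown here.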

This had been shown to happen in the real world, but until now nobody had done it inside virtual reality, so we did the same thing in VR. Here are two people sitting inside virtual reality, and inside the VR they can see themselves like this: they can see the other person, point at them, and do that same finger-tracking activity.

Now of course, the thing about virtual reality is that you can do things you can't do in the real world. For example, you can put yourself in somebody else's body. This is that view there: now when I look down, I can see two pairs of hands coming out of my body, and I can do the same finger-tracking activity but with somebody else's hands. And it turns out that when you do this, you get even more brain synchronization.

These are the two brains. It's a slightly busy chart, but you can see the lines going through the electrodes. This is before they start the activity, and the width and color of each line shows how many connections there are; there are a few connections between the two brains. But when we finish the activity, when I'm seeing from the other person's perspective, you can see these connections here, and the big red lines show that there is now a very strong connection between the two brains.

So far we've shown how you can use augmented and virtual reality to create this brain synchronization and brain connection, and to look at the world from other people's perspectives. Of course, technology doesn't stand still, and over the next few years you'll see even more trends like this. A couple of trends are especially important. The first is the trend towards experience capture.

We'll go from being able to share people's faces to being able to share places: a decade ago I could have had a video conference with somebody else, and now I can live stream my view as 360 video.

A second important trend is faster networks, and faster networks mean more bandwidth for better communication. Twenty years ago I was using a dial-up modem; now, in my apartment here in Auckland, I've got a gigabit fiber connection, so I can stream high-definition video of myself to the world.

And the third trend is towards implicit understanding, which basically means systems that recognize our behavior and emotion. For example, I can push a button on my phone and talk to Siri, and Siri understands what I'm saying; or a camera can understand what I'm doing or read the expressions on my face.

These three trends together, natural collaboration, experience capture, and implicit understanding, all overlap in an area that we call empathic computing. The goal of empathic computing is basically to develop systems that allow us to share what we are seeing, hearing, and feeling with others.

Overall, this means we now have a trend towards what you might call empathic tele-existence: new display technology, the capture of space, and emotion sensing all blend together to create a new type of collaboration.

Empathic tele-existence means that you move from being an observer of somebody else to being a participant with them in the same space: you can see the world from their perspective. Communication changes from explicit to implicit, because the system can recognize gestures and other non-verbal cues. And most importantly, you feel like you're doing things together, rather than watching somebody else do something.

So in the coming years we'll have technology that will allow us to know what other people are seeing, hearing, and feeling, and for the first time we'll be able to truly see the world from somebody else's perspective and know what it's like to get inside their head. Empathic computing will really change the way we work and play with other people, forever.