Marc Raibert: Meet Spot, the robot dog that can run, hop and open doors

(Laughter)

That’s SpotMini.

He’ll be back in a little while.

I –

(Applause)

I love building robots.

And my long-term goal is to build robots

that can do what people and animals do.

And there are three things in particular

that we’re interested in.

One is balance and dynamic mobility,

the second one is mobile manipulation,

and the third one is mobile perception.

So, dynamic mobility and balance –

I’m going to do a demo for you.

I’m standing here, balancing.

I can see you’re not very impressed.
OK, how about now?

(Laughter)

How about now?

(Applause)

Those simple capabilities mean that people
can go almost anywhere on earth,

on any kind of terrain.

We want to capture that for robots.

What about manipulation?

I’m holding this clicker in my hand;

I’m not even looking at it,

and I can manipulate it
without any problem.

But even more important,

I can move my body while I hold
the manipulator, the clicker,

and stabilize and coordinate my body,

and I can even walk around.

And that means
I can move around in the world

and expand the range
of my arms and my hands

and really be able to handle
almost anything.

So that’s mobile manipulation.

And all of you can do this.

Third is perception.

I’m looking at a room
with over 1,000 people in it,

and my amazing visual system
can see every one of you –

you’re all stable in space,

even when I move my head,

even when I move around.

That kind of mobile perception
is really important for robots

that are going to move and act

out in the world.

I’m going to give you
a little status report

on where we are in developing robots
toward these ends.

The first three robots are all
dynamically stabilized robots.

This one goes back
a little over 10 years –

“BigDog.”

It’s got a gyroscope
that helps stabilize it.

It’s got sensors and a control computer.

Here’s a Cheetah robot
that’s running with a galloping gait,

where it recycles its energy,

it bounces on the ground,

and it’s computing all the time

in order to keep itself
stabilized and propelled.
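
That constant computation echoes the foot-placement law from Raibert's own earlier hopping robots: put the foot where it neutralizes the current speed, plus a correction toward the desired speed. A minimal sketch with invented gains, not the actual BigDog or Cheetah controller:

```python
# Simplified Raibert-style foot-placement law ("Legged Robots That Balance"):
# place the foot ahead of the hip to neutralize the current speed, plus a
# correction toward the desired speed. Gains and values are illustrative only.

def foot_placement(velocity: float, desired_velocity: float,
                   stance_time: float, gain: float = 0.05) -> float:
    """Forward foot offset (m) from the hip at touchdown."""
    neutral = velocity * stance_time / 2.0               # cancels current motion
    correction = gain * (velocity - desired_velocity)    # nudges speed toward target
    return neutral + correction

print(foot_placement(velocity=1.2, desired_velocity=1.0, stance_time=0.2))  # ~0.13 m
```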

And here’s a bigger robot

that’s got such good
locomotion using its legs

that it can go in deep snow.

This is about 10 inches deep,

and it doesn’t really have any trouble.

This is Spot, a new generation of robot –

just slightly older than the one
that came out onstage.

And we’ve been asking the question –

you’ve all heard about drone delivery:

Can we deliver packages
to your houses with drones?

Well, what about plain old
legged-robot delivery?

(Laughter)

So we’ve been taking our robot
to our employees’ homes

to see whether we could get in –

(Laughter)

the various access ways.

And believe me, in the Boston area,

there’s every manner
of stairway twists and turns.

So it’s a real challenge.

But we’re doing very well,
about 70 percent of the way.

And here’s mobile manipulation,

where we’ve put an arm on the robot,

and it’s finding its way through the door.

Now, one of the important things
about making autonomous robots

is to make them not do
just exactly what you say,

but make them deal with the uncertainty
of what happens in the real world.

So we have Steve there,
one of the engineers,

giving the robot a hard time.

(Laughter)

And the programming
still tolerates all that disturbance –

the robot does what it’s supposed to.

Here’s another example,
where Eric is tugging on the robot

as it goes up the stairs.

And believe me,

getting it to do what it’s supposed to do
in those circumstances

is a real challenge,

but the result is something
that’s going to generalize

and make robots much more autonomous
than they would be otherwise.

This is Atlas, a humanoid robot.

It’s a third-generation humanoid
that we’ve been building.

I’ll tell you a little bit
about the hardware design later.

And we’ve been saying:

How close to human levels
of performance and speed could we get

in an ordinary task,

like moving boxes around on a conveyor?

We’re getting up to about two-thirds
of the speed at which a human operates,

on average.

And this robot is using both hands,
it’s using its body,

it’s stepping,

so it’s really an example
of dynamic stability,

mobile manipulation

and mobile perception.

Here –

(Laughter)

We actually have two Atlases.

(Laughter)

Now, not everything goes exactly
the way it’s supposed to.

(Laughter)

And here’s our latest robot,
called “Handle.”

Handle is interesting,
because it’s sort of half like an animal,

and it’s half something else

with these leg-like things and wheels.

It’s got its arms on
in kind of a funny way,

but it really does some remarkable things.

It can carry 100 pounds.

It’s probably going to lift
more than that,

but so far we’ve done 100.

It’s got some pretty good
rough-terrain capability,

even though it has wheels.

And Handle loves to put on a show.

(Laughter)

(Applause)

I’m going to give you
a little bit of robot religion.

A lot of people think that a robot
is a machine where there’s a computer

that’s telling it what to do,

and the computer is listening
through its sensors.

But that’s really only half of the story.

The real story is
that the computer is on one side,

making suggestions to the robot,

and on the other side
are the physics of the world.

And that physics involves gravity,
friction, bouncing into things.

In order to have a successful robot,

my religion is that you have to do
a holistic design,

where you’re designing the software,
the hardware and the behavior

all at one time,

and all these parts really intermesh
and cooperate with each other.

And when you get the perfect design,
you get a real harmony

between all those parts
interacting with each other.

So it’s half software and half hardware,

plus the behavior.

We’ve done some work lately
on the hardware, where we tried to go –

the picture on the left
is a conventional design,

where you have parts
that are all bolted together,

conductors, tubes, connectors.

And on the right
is a more integrated thing;

it’s supposed to look like
an anatomy drawing.

Using the miracle of 3-D printing,

we’re starting to build parts of robots

that look a lot more
like the anatomy of an animal.

So that’s an upper-leg part
that has hydraulic pathways –

actuators, filters –

all embedded, all printed as one piece,

and the whole structure is developed

with a knowledge of what the loads
and behavior are going to be,

which is available from data
recorded from robots

and simulations and things like that.

So it’s a data-driven hardware design.
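
As a toy illustration of what "data-driven" can mean here: sizing a printed member from logged peak loads plus a safety factor. All numbers are invented, and a real design would run finite-element analysis over full recorded load histories:

```python
# Hypothetical illustration of data-driven part sizing: pick a minimum
# cross-section from peak loads logged on the robot, with a safety factor.
# Numbers are invented; this is not Boston Dynamics' design process.

logged_peak_loads_n = [3100.0, 2800.0, 3350.0]   # peak leg loads from test runs (N)
safety_factor = 2.0
yield_strength_pa = 2.7e8                        # e.g., a 7075-class aluminum alloy

required_area_m2 = safety_factor * max(logged_peak_loads_n) / yield_strength_pa
print(f"minimum cross-section: {required_area_m2 * 1e6:.1f} mm^2")
```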

And using processes like that,

not only the upper leg
but some other things,

we’ve gotten our robots to go from big,
behemoth, bulky, slow, bad robots –

that one on the right,
weighing almost 400 pounds –

down to the one in the middle
which was just in the video,

weighs about 190 pounds,

just a little bit more than me,

and we have a new one,

which is working but I’m not
going to show it to you yet,

on the left,

which weighs just 165 pounds,

with all the same
strength and capabilities.

So these things are really getting
better very quickly.

So it’s time for Spot to come back out,

and we’re going to demonstrate
a little bit of mobility,

dexterity and perception.

This is Seth Davis,
who’s my robot wrangler today,

and he’s giving Spot
some general direction

by steering it around,

but all the coordination
of the legs and the sensors

is done by the robot’s computers on board.

The robot can walk
with a number of different gaits;

it’s got a gyro,

or a solid-state gyro,

an IMU on board.

Obviously, it’s got a battery,
and things like that.

One of the cool things
about a legged robot is,

it’s omnidirectional.

In addition to going forward,
it can go sideways,

it can turn in place.

And this robot
is a little bit of a show-off.

It loves to use its dynamic gaits,

like running –

(Laughter)

And it’s got one more.

(Laughter)

Now if it were really a show-off,
it would be hopping on one foot,

but, you know.

Now, Spot has a set of cameras
here, stereo cameras,

and we have a feed up in the center.

It’s kind of dark out in the audience,

but it’s going to use those cameras
in order to look at the terrain

right in front of it,

while it goes over
these obstacles back here.

For this demo, Seth is steering,

but the robot’s doing
all its own terrain planning.

This is a terrain map,

where the data from the cameras
is being developed in real time,

showing the red spots,
which are where it doesn’t want to step,

and the green spots,
which are the good places.

And here it’s treating
them like stepping-stones.

So it’s trying to stay up on the blocks,

and it adjusts its stride,

and there’s a ton of planning

that has to go into
an operation like that,

and it does all
that planning in real time,

where it adjusts the steps
a little bit longer

or a little bit shorter.
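
To make that concrete, here is a minimal sketch of a stepping-stone footstep chooser over a grid costmap; the functions, thresholds and grid are hypothetical, not Spot's actual planner:

```python
import numpy as np

# Toy footstep selection over a red/green traversability map, in the spirit
# of the terrain map described above. Hypothetical code, not Spot's planner.

def traversability(heights: np.ndarray, max_slope: float = 0.15) -> np.ndarray:
    """Mark cells green (steppable) where the local slope is gentle."""
    gy, gx = np.gradient(heights)
    return np.hypot(gx, gy) < max_slope          # True = green, False = red

def next_footstep(green: np.ndarray, foot_xy, nominal_stride: int = 4):
    """Aim a nominal stride ahead; stretch or shorten a little to hit green."""
    x, y = foot_xy
    for stretch in (0, 1, -1, 2, -2):            # step a bit longer or shorter
        tx = x + nominal_stride + stretch
        if 0 <= tx < green.shape[1] and green[y, tx]:
            return (tx, y)
    return None                                  # no safe foothold: stop, replan

heights = np.zeros((5, 20))
heights[:, 8] = 0.5                              # a block edge in the path
print(next_footstep(traversability(heights), foot_xy=(2, 2)))   # -> (6, 2)
```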

Now we’re going to change it
into a different mode,

where it’s just going to treat
the blocks like terrain

and decide whether to step up or down

as it goes.

So this is using dynamic balance

and mobile perception,

because it has to coordinate what it sees
along with how it’s moving.

The other thing Spot has is a robot arm.

Some of you may see that
as a head and a neck,

but believe me, it’s an arm.

Seth is driving it around.

He’s actually driving the hand
and the body is following.

So the two are coordinated
in the way I was talking about before –

in the way people can do that.

In fact, one of the cool things
Spot can do is what we call “chicken-head mode,”

and it keeps its head
in one place in space,

and it moves its body all around.

There’s a variation of this
that’s called “twerking” –

(Laughter)

but we’re not going to use that today.

(Laughter)
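
For the curious, the chicken-head trick can be framed as re-expressing a fixed world-frame hand target in the moving body frame at every control tick; a hedged two-dimensional sketch, not Spot's actual code:

```python
import numpy as np

# Toy 2-D "chicken-head" sketch: the hand target stays fixed in the world
# frame, and each tick we recompute it in the moving body frame so the arm
# controller can cancel the body's motion. Hypothetical, not Spot's software.

def world_to_body(body_xy: np.ndarray, body_yaw: float, world_pt: np.ndarray) -> np.ndarray:
    """Transform a world-frame point into the body frame."""
    c, s = np.cos(-body_yaw), np.sin(-body_yaw)
    R = np.array([[c, -s], [s, c]])              # rotation by -yaw
    return R @ (world_pt - body_xy)

hand_target_world = np.array([1.0, 0.5])         # where the head/hand should stay
for t in np.linspace(0.0, 1.0, 5):               # body wanders; target stays put
    body_xy = np.array([0.2 * t, -0.1 * t])
    body_yaw = 0.3 * t
    arm_cmd = world_to_body(body_xy, body_yaw, hand_target_world)
    print(arm_cmd)                               # per-tick setpoint for the arm
```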

So, Spot: I’m feeling a little thirsty.
Could you get me a soda?

For this demo,
Seth is not doing any driving.

We have a LIDAR on the back of the robot,

and it’s using these props
we’ve put on the stage

to localize itself.

It’s gone over to that location.
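
A toy version of that kind of landmark localization: estimate position from lidar ranges to props at known locations, by linearizing the range equations into a least-squares problem (invented geometry; not the robot's actual system):

```python
import numpy as np

# Toy 2-D landmark localization, like the stage props described above:
# solve for the robot's position from ranges to props at known positions.
# Invented numbers; not Boston Dynamics' localization stack.

props = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])   # known prop positions (m)
ranges = np.array([2.24, 2.24, 2.83])                    # measured lidar ranges (m)

def localize(props: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Linearized trilateration: subtract the first range equation from the rest."""
    p0, r0 = props[0], ranges[0]
    A = 2.0 * (props[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(props[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

print(localize(props, ranges))   # roughly [2.0, 1.0] for these numbers
```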

Now it’s using a camera that’s in its hand

to find the cup,

picks it up –

and again, Seth’s not driving.

We’ve planned out a path for it to go –

it looked like it was
going off the path –

and now Seth’s going
to take over control again,

because I’m a little bit chicken
about having it do this by itself.

Thank you, Spot.

(Applause)

So, Spot:

How do you feel about having just finished
your TED performance?

(Laughter)

Me, too!

(Laughter)

Thank you all,

and thanks to the team at Boston Dynamics,

who did all the hard work behind this.

(Applause)

Helen Walters: Marc,
come back in the middle.

Thank you so much.

Come over here, I have questions.

So, you mentioned the UPS
and the package delivery.

What are the other applications
that you see for your robots?

Marc Raibert: You know,
I think that robots

that have the capabilities
I’ve been talking about

are going to be incredibly useful.

About a year ago, I went to Fukushima

to see what the situation was there,

and there’s just a huge need

for machines that can go
into some of the dirty places

and help remediate that.

I think it won’t be too long until
we have robots like this in our homes,

and one of the big needs
is to take care of the aging

and invalids.

I think that it won’t be too long
till we’re using robots

to help take care of our parents,

or probably more likely,
have our children help take care of us.

And there’s a bunch of other things.

I think the sky’s the limit.

Many of the ideas
we haven’t thought of yet,

and people like you will help us
think of new applications.

HW: So what about the dark side?

What about the military?

Are they interested?

MR: Sure, the military has been
a big funder of robotics.

I don’t think the military
is the dark side myself,

but I think, as with all
advanced technology,

it can be used for all kinds of things.

HW: Awesome. Thank you so much.

MR: OK, you’re welcome.

Thank you.

(Applause)