How Benjamin Button Got His Face | Ed Ulbrich

I'm here today representing a team of artists, technologists and filmmakers that worked together on a remarkable film project for the last four years, and along the way they created a breakthrough in computer visualization. I want to show you a clip of the film now. Hopefully it won't stutter, and if we did our jobs well, you won't know that we were even involved.

(Video) Voice: ... but you seem to have more hair.

Benjamin Button: What if I told you that I wasn't getting older, but I was getting younger than everybody else? I was born with some form of disease.

Voice: What kind of disease?

Benjamin: I was born old.

Man: I'm sorry.

Benjamin: No need to be. There's nothing wrong with old age.

Girl: Are you sick?

Benjamin: I heard Momma and Tizzy whisper, and they said I was gonna die soon, but maybe not.

Girl: You're different than anybody I've ever met.

Benjamin: There were many changes, some you could see, some you couldn't. Hair started growing in all sorts of places, along with other things. I felt pretty good, considering.

That was a clip from The Curious Case of Benjamin Button. Maybe you've seen it, or you've heard of the story, but what you might not know is that for nearly the first hour of the film, the main character, Benjamin Button, who's played by Brad Pitt, is completely computer-generated from the neck up. There's no use of prosthetic makeup or photography of Brad superimposed over another actor's body; we created a completely digital human head.

I'd like to start with a little bit of history on the project. This is based on an F. Scott Fitzgerald short story about a man who's born old and lives his life in reverse. Now, this movie floated around Hollywood for well over half a century, and we first got involved with the project in the early '90s, with Ron Howard as the director. We took a lot of meetings and we seriously considered it, but at the time we had to throw in the towel. It was deemed impossible: it was beyond the technology of the day to depict a man aging backwards. The human form, in particular the human head, has been considered the holy grail of our industry.

The project came back about a decade later, this time with a director named David Fincher. Now, Fincher is an interesting guy. David is fearless of technology, he is absolutely tenacious, and he won't take no for an answer. And David believed, like we do in the visual effects industry, that anything is possible as long as you have enough time, resources and, of course, money. David had an interesting take on the film, and he threw a challenge at us: he wanted the main character of the film to be played from the cradle to the grave by one actor. It happened to be this guy.

We went through a process of elimination and a process of discovery with David, and we ruled out, of course, swapping actors. That was one idea: we would have different actors, and we would hand off from actor to actor. We also ruled out the idea of using makeup. We realized that prosthetic makeup just wouldn't hold up, particularly in close-up. And makeup is an additive process; you have to build the face up, and David wanted to carve deeply into Brad's face to bring the aging to this character. He needed to be a very sympathetic character. So we decided to cast a series of little people that would play the different bodies of Benjamin at the different increments of his life, and that we would in fact create a computer-generated version of Brad's head, aged to appear as Benjamin, and attach that to the body of the real actor.

It sounded great. Of course, this was the holy grail of our industry, and the fact that this guy is a global icon didn't help either, because I'm sure if any of you ever stand in line at the grocery store, you know, we see his face constantly. So there really was no tolerable margin of error. There were two studios involved, Warner Brothers and Paramount, and they both believed this would make an amazing film. Of course, it was a very high-risk proposition: there was lots of money and reputations at stake. We believed that we had a very solid methodology that might work, but despite our verbal assurances, they wanted some proof.

So in 2004, they commissioned us to do a screen test of Benjamin. We did it in about five weeks, but we used lots of cheats and shortcuts; we basically put something together to get through the meeting. I'll roll that for you now. This was the first test for Benjamin Button, and in here you can see that's a computer-generated head, pretty good, attached to the body of another actor. And it worked, and it gave the studio great relief. After many years of starts and stops on this project, and making that tough decision, they finally decided to greenlight the movie. I can remember, actually, when I got the phone call to congratulate us, to say the movie was a go, I actually threw up.

You know, this is tough stuff. So we started to have early team meetings, and we got everybody together, and it was really more like therapy in the beginning, convincing each other and reassuring each other that we could actually undertake this. We had to hold up an hour of a movie with a character, and it's not a special-effects film; it has to be a man. We really felt like we were in kind of a 12-step program, and of course, the first step is: admit you've got a problem. And we had a big problem: we didn't know how we were going to do this. But we did know one thing. Being from the visual effects industry, we, with David, believed that we now had enough time, enough resources, and, God, we hoped we had enough money. And we had enough passion to will the processes and technology into existence. So when you're faced with something like that, you've got to break it down: you take the big problem and you break it down into smaller pieces that you can start to attack.

We had three main areas to focus on. We needed to make Brad look a lot older; we needed to age him 45 years or so. We also needed to make sure that we could take Brad's idiosyncrasies, his little tics, the little subtleties that make him who he is, and have that translate through our process so that it appears in Benjamin on the screen. And we also needed to create a character that could hold up under really all conditions. He needed to be able to walk in broad daylight, at nighttime, under candlelight; he had to hold an extreme close-up; he had to deliver dialogue; he had to be able to run; he had to be able to sweat; he had to be able to take a bath, to cry; he even had to throw up. Not all at the same time, but he had to do all of those things. And the work had to hold up for almost the first hour of the movie. We did about 325 shots. So we needed a system that would allow Benjamin to do everything a human being can do, and we realized that there was a giant chasm between the state of the art of technology in about 2004 and where we needed it to be. So we focused on motion capture.

Now, I'm sure many of you have seen motion capture, and the state of the art at the time was something called marker-based motion capture. I'll show you an example here. The idea is, basically, you wear a leotard and they put some reflective markers on your body, and instead of using cameras, there are infrared sensors around a volume. Those infrared sensors track the 3D position of those markers in real time, and then animators can take the data of the motion of those markers and apply it to a computer-generated character. So you can see the computer characters on the right are having the same complex motion as the dancers.
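To make the idea concrete, here is a minimal Python sketch of that pipeline; the tracking stub, the marker count, and the naive one-to-one marker-to-joint mapping are all hypothetical stand-ins (real solvers fit joint rotations to many markers at once):

```python
import numpy as np

NUM_MARKERS = 4  # hypothetical; real suits carry dozens of markers

def track_markers(frame):
    """Stand-in for the infrared sensor array: returns the 3D position
    of every reflective marker at this frame (fabricated motion here)."""
    t = frame / 24.0  # assume 24 frames per second
    return np.array([[np.sin(t + i), np.cos(t + i), 0.5 * i]
                     for i in range(NUM_MARKERS)])

def apply_to_character(joints, markers):
    """Drive the CG character: a naive one-to-one marker-to-joint copy;
    production solvers reconstruct joint rotations, not just positions."""
    joints[:] = markers

joints = np.zeros((NUM_MARKERS, 3))
for frame in range(48):  # two seconds of captured performance at 24 fps
    apply_to_character(joints, track_markers(frame))
```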

We also looked at a number of other films of the time that were using facial marker tracking. That's the idea of putting markers on the human face and doing the same process, and as you can see, it gives you a pretty crappy performance; that's not terribly compelling. What we realized was that what we needed was the information about what was going on between the markers. We needed the subtleties of the skin; we needed to see skin moving over muscle, over bone; we needed creases and dimples and wrinkles and all of those things. So our first big revelation was to completely abort and walk away from the technology of the day, the status quo, the state of the art. We aborted using motion capture, and we were now well out of our comfort zone, in uncharted territory.

So we were left with this idea that we ended up calling the "technology stew." We started to look out into other fields, and the idea was that we were going to find nuggets, or gems, of technology that come from other industries, like medical imaging or the video game space, and re-appropriate them. And we had to create kind of a sauce, and the sauce was code and software that we wrote to allow these disparate pieces of technology to come together and work as one.

So, initially, we came across some remarkable research done by a gentleman named Dr. Paul Ekman in the early '70s. He believed that he could, in fact, catalog the human face, and he came up with the idea of the Facial Action Coding System, or FACS. He believed that there are basically 70 basic poses, or shapes, of the human face, and that those basic poses or shapes can be combined to create infinite possibilities of everything the human face is capable of doing. And of course, these transcend age, race, culture and gender. This became the foundation of our research as we went forward.
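The power of Ekman's idea is that an expression becomes a weighted mix of a small basis of shapes. Here is a minimal sketch of that combination, assuming a simple linear blendshape model; the mesh size, the pose data, and the chosen weights are all made up for illustration:

```python
import numpy as np

NUM_VERTS = 5000  # hypothetical vertex count for a face mesh
NUM_POSES = 70    # Ekman's basic poses (FACS action units)

rng = np.random.default_rng(0)
neutral = rng.standard_normal((NUM_VERTS, 3))                    # resting face
offsets = rng.standard_normal((NUM_POSES, NUM_VERTS, 3)) * 0.01  # per-pose deltas

def combine(weights):
    """Any expression is the neutral face plus a weighted sum of the
    basic pose offsets; combining them gives 'infinite possibilities'."""
    return neutral + np.tensordot(weights, offsets, axes=1)

w = np.zeros(NUM_POSES)
w[[12, 1]] = [1.0, 0.6]  # e.g. a lip-corner pull plus a raised inner brow
expression = combine(w)  # a (NUM_VERTS, 3) posed face mesh
```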

And then we came across some remarkable technology called Contour. Here you can see a subject having phosphorescent makeup stippled on her face. Now what we're looking at is really creating a surface capture, as opposed to a marker capture. The subject stands in front of a computer array of cameras, and those cameras can, frame by frame, reconstruct the geometry of exactly what the subject is doing at the moment. So, effectively, you get 3D data in real time of the subject. And if you look at a comparison, on the left you see what volumetric data gives us, and on the right you see what markers give us. So, clearly, we were in a substantially better place for this. But these were the early days of this technology, and it wasn't really proven yet. We measure complexity and fidelity of data in terms of polygonal count, and on the left we were seeing a hundred thousand polygons; we could go up into the millions of polygons. It seemed to be infinite. This is when we had our "aha."

This was the breakthrough; this is when we said, "OK, we're going to be OK; this is actually going to work." And the "aha" was: what if we could take Brad Pitt, and we could put Brad in this device, use the Contour process, stipple on the phosphorescent makeup, put him under the black lights, and we could, in fact, scan him in real time performing Ekman's FACS poses? So, effectively, we ended up with a 3D database of everything Brad Pitt's face is capable of doing. From there, we carved up those faces into smaller pieces and components of his face. So we ended up with literally thousands and thousands and thousands of shapes: a complete database of all the possibilities his face is capable of doing.

Now, that's great, except we had him at age 44; we needed to put another 40 years on him at this point. We brought in Rick Baker, and Rick is one of the great makeup and special-effects gurus of our industry. We also brought in a gentleman named Kazu Tsuji, and Kazu Tsuji is one of the great photoreal sculptors of our time. We commissioned them to make a maquette, or a bust, of Benjamin. So, in the spirit of the great unveiling, I had to do this; I had to unveil something. So this is Ben at 80. We created three of these: there's Ben 80, there's Ben 70, there's Ben 60. And this really became the template for moving forward. Now, this was made from a life cast of Brad, so, in fact, anatomically it is correct: the eyes, the jaw, the teeth, everything is in perfect alignment with what the real guy has.

We had these maquettes scanned into the computer at very high resolution, with an enormous polygonal count. So now we had three age increments of Benjamin in the computer. But we needed to get a database of him doing more than that, so we went through a process called retargeting. This is Brad doing one of the Ekman FACS poses, and here's the resulting data that comes from that, the model that comes from that. Retargeting is the process of transposing that data onto another model. And because the life cast, or the bust, the maquette, of Benjamin was made from Brad, we could transpose the data of Brad at 44 onto Brad at 87. So now we had a 3D database of everything Brad Pitt's face can do at age 87, in his 70s and in his 60s.
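Because the maquettes were made from a life cast of Brad, the young and aged heads can share vertex order, which makes the transfer conceptually simple. Here is a minimal sketch of the idea, with hypothetical data (the production retargeting was certainly more sophisticated than a raw offset copy):

```python
import numpy as np

def retarget(neutral_44, posed_44, neutral_87):
    """Transfer a FACS pose from Brad at 44 to the aged head: take what
    the pose does to the young face (per-vertex offsets) and apply the
    same offsets to the old face, relying on shared vertex order."""
    delta = posed_44 - neutral_44
    return neutral_87 + delta

# Hypothetical stand-ins for the scanned heads.
rng = np.random.default_rng(1)
young_neutral = rng.standard_normal((5000, 3))
young_smile = young_neutral + rng.standard_normal((5000, 3)) * 0.01
old_neutral = 1.05 * young_neutral  # pretend "aged" geometry
old_smile = retarget(young_neutral, young_smile, old_neutral)
```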

Next, we had to go into the shooting process. So while all that was going on, we were down in New Orleans and at locations around the world, and we shot our body actors. We shot them wearing blue hoods. These are the gentlemen who played Benjamin, and the blue hoods helped us with two things: one, we could easily erase their heads; and two, we put tracking markers on their heads so that we could recreate the camera motion and the lens optics from the set. But now we needed to get Brad's performance to drive our virtual Benjamin. So we edited the footage that was shot on location with the rest of the cast and the body actors, and about six months later we brought Brad onto a soundstage in Los Angeles. He watched on the screen, and his job then was to become Benjamin. So we looped the scenes; he watched again and again; we encouraged him to improvise, and he took Benjamin to interesting and unusual places that we didn't think he was going to go. We shot him with four HD cameras so we'd get multiple views of him, and then David would choose the take of Brad being Benjamin that he thought best matched the footage with the rest of the cast.

From there, we went into a process called image analysis. Here you can see, again, the chosen take, and we're seeing that data being transposed onto Ben 87. What's interesting about this is that image analysis takes timings from different components of Benjamin's face. So we could choose, say, his left eyebrow, and the software would tell us that in frame 14 the left eyebrow begins to move from here to here, and it concludes moving in frame 32. And we could choose any number of positions on the face to pull that data from.
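That "begins in frame 14, concludes in frame 32" measurement is easy to picture in code. Here is a minimal sketch of timing extraction for one tracked facial feature; the track data and the motion threshold are hypothetical:

```python
import numpy as np

def motion_window(track, threshold=1e-3):
    """Given per-frame 2D positions of one facial feature (say, the left
    eyebrow), return the frame where it begins to move and the frame
    where it concludes moving."""
    speed = np.linalg.norm(np.diff(track, axis=0), axis=1)
    moving = np.flatnonzero(speed > threshold)
    if moving.size == 0:
        return None
    return int(moving[0]), int(moving[-1]) + 1

# Hypothetical eyebrow track: still, then rising between frames 14 and 32.
track = np.zeros((48, 2))
track[14:33, 1] = np.linspace(0.0, 5.0, 19)
track[33:, 1] = 5.0
print(motion_window(track))  # -> (14, 32), like the eyebrow in the talk
```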

And then the sauce I talked about with our technology stew, that secret sauce, was, effectively, software that allowed us to match the performance footage of Brad in live action with our database of aged Benjamin, the FACS shapes that we had. On a frame-by-frame basis, we could actually reconstruct a 3D head that exactly matched the performance of Brad.
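One way to imagine that matching step: treat each frame of analyzed footage as a target and solve for the combination of FACS shapes that best reproduces it. This is only a sketch under an assumed linear blendshape model with a least-squares fit, on made-up data; the actual software was surely far more elaborate:

```python
import numpy as np

def solve_facs_weights(basis, neutral, target):
    """Find the mix of FACS shapes that best reproduces what the image
    analysis saw the face doing in this frame (a least-squares fit)."""
    A = (basis - neutral).reshape(basis.shape[0], -1).T  # (3V, P) shape deltas
    b = (target - neutral).ravel()                       # observed delta
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    return weights

# Hypothetical database: 70 aged-Benjamin FACS shapes over 1000 vertices.
rng = np.random.default_rng(2)
P, V = 70, 1000
neutral = rng.standard_normal((V, 3))
basis = neutral + rng.standard_normal((P, V, 3)) * 0.01

target = neutral + 0.4 * (basis[3] - neutral)   # one observed frame
w = solve_facs_weights(basis, neutral, target)  # recovers ~0.4 on shape 3
head = neutral + np.tensordot(w, basis - neutral, axes=1)  # rebuilt 3D head
```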

So this is how the finished shot appeared in the film. Here you can see the body actor. Then this is what we called the "dead head," no reference to Jerry Garcia. Then here's the reconstructed performance, now with the timings of the performance. And then, again, the final shot. It was a long process.

The next section here, I'm going to just blast through, because we could do a whole TED Talk on each of the next several slides. We had to create a lighting system. A big part of our process was creating a lighting environment for every single location that Benjamin had to appear in, so that we could put Ben's head into any scene and it would exactly match the lighting that's on the other actors in the real world.
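The talk doesn't spell out the method, but the standard technique for this kind of match is image-based lighting: capture the light arriving at each location, for example as a latitude-longitude panorama, and use it to illuminate the CG head. Take this as an assumed, minimal sketch of a diffuse environment-map lookup:

```python
import numpy as np

def shade_diffuse(normals, env_map):
    """Look up each surface normal's direction in a latitude-longitude
    environment map so the CG head picks up the location's light. (A
    real renderer integrates over the hemisphere; this single-sample
    lookup is just the idea in miniature.)"""
    h, w, _ = env_map.shape
    theta = np.arccos(np.clip(normals[:, 1], -1.0, 1.0))    # polar angle
    phi = np.arctan2(normals[:, 2], normals[:, 0]) + np.pi  # azimuth in [0, 2*pi]
    rows = np.clip((theta / np.pi * h).astype(int), 0, h - 1)
    cols = np.clip((phi / (2 * np.pi) * w).astype(int), 0, w - 1)
    return env_map[rows, cols]  # RGB radiance arriving along each normal

env = 0.8 * np.ones((64, 128, 3))  # stand-in panorama captured on location
normals = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
print(shade_diffuse(normals, env))
```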

We also had to create an eye system. We found the old adage, you know, "the eyes are the window to the soul," to be absolutely true. So the goal here was to keep everybody looking in Ben's eyes, and if you could feel the warmth, and feel the humanity, and feel his intent coming through the eyes, then we would succeed. So we had one person focused on the eye system for almost two full years.

We also had to create a mouth system. We worked from dental molds of Brad, and we aged the teeth over time. We also had to create an articulating tongue that allowed him to enunciate his words, so there was a whole system and software written to articulate the tongue. We had one person devoted to the tongue for about nine months. He was very popular.

Skin displacement was another big deal. The skin had to be absolutely accurate. And he's also in an old-age home, a nursing home, around other old people, so he had to look exactly the same as the others. So, lots of work on skin deformation. You can see that in some of these cases it works, and in some cases it looks bad; this was a very, very, very early test in our process.

So, effectively, we created a digital puppet that Brad Pitt could operate with his own face. There were no animators necessary to come in and interpret behavior or enhance his performance. There was something we encountered, though, that we ended up calling "the digital Botox effect." As things went through this process, Fincher would always say it "sandblasts the edges off of the performance." One thing our process and the technology couldn't do was understand intent, the intent of the actor. It sees a smile as a smile; it doesn't recognize an ironic smile, or a happy smile, or a frustrated smile. So it did take humans to push it one way or the other. But that's why we ended up calling the entire process and all the technology "emotion capture," as opposed to just motion capture. Take another look.

(Video) Benjamin Button: I heard Momma and Tizzy whispering. They said I was gonna die soon, but maybe not.

I heard Momma and Tizzy whispering. They said I was gonna die soon, but maybe not.

I heard Momma and Tizzy whispering. They said I was gonna die soon, but maybe not.

And that's how to create a digital human in 18 minutes. A couple of quick factoids: it really took 155 people over two years, and we didn't even talk about the 60 hairstyles and the all-digital haircut. But that is Benjamin. Thank you.