Your Algorithm Will See You Now

Transcriber: Petra Molnárová
Reviewer: Hani Eldalees

So today, your algorithm
will see you now

and what do I mean by that?

Well, I’m going to get into it.

So, as a dermatologist,
every day in the clinic,

we will see skin like this.

I’ll walk in and see a patient

and their back covered
with all these different spots.

As a dermatologist, I know to look
at some of the more important ones.

Why did this spot stand out more to me
than all of the rest on there?

And why did this jump out
to me as being something

like a melanoma that I
would want to biopsy?

Well, that is the importance of vision.

It’s also the importance
of decades of training,

thousands of hours of experience

and real-world patient
context that allows you

to come to that diagnosis
as a dermatologist.

Now, I’d like to talk about how
important these skin cancers are

and how important all skin cancers
and all cancers are.

On average, per year, we have about
5.4 million non-melanoma skin cancers

that are diagnosed

and that comes out to about one
skin cancer every five seconds.

So what I’d like to discuss is how we
can leverage ubiquitous technology,

that we have all around us,

to help come to a higher
fidelity diagnosis,

a better rendering of care for patients
and higher-fidelity health care.

Specifically, this ubiquitous
technology that we’re talking about -

things like your iPhone,
your smartphone camera,

even your Apple Watch
and other sensor data -

we can now use this technology

to reach a more conclusive
diagnosis in health care

using artificial intelligence.

Now, before we jump into all
of the computer science and the fun stuff,

I want to do something that might
not have been done in a TED Talk before.

A lot of people listen to these remotely,

maybe as a podcast, so I’d like you
to, at this moment, safely pull over

on the road, or pause your workout,
and take a look at your phone

or whatever streaming device you’re using:

we’re doing a little pop quiz.

This pop quiz is a guessing game.

Who am I?

Who am I specifically talking
about right now?

And whenever the answer comes to
your mind, just blurt it out.

It’s OK if you’re bothering
whoever’s next to you,

they’ll probably like it and ask
about the talk itself.

So, I am a male.

I have a beard.

I’m tall.

I like speeches.

I was born in a log cabin.

And I was a former president of the USA.

If you’ve come to a conclusion, again,
feel free to blurt that one out.

And that’s Abraham Lincoln over here.

Those pieces of data are all siloed off
into individual components,

like the height, the description,
the beard, all of these things,

and it takes a little time
to tease out that data,

to see exactly what we’re talking about.

Now, instead of leveraging those

individual natural-language data points,
if we leverage visual data,

you now get to the second
question of this pop quiz,

which is, who is this?

And the answer would be
the greatest player in NBA history,

Kobe Bryant, and it’s a lot easier
to come to conclusions like that

based on visual data than on individual
data points.

And I’d like to discuss today,
how we can do that same thing

in dermatology and in health care,

to get a much better diagnosis

and really leverage a lot more
data points for our patients.

So, specifically, when patients
come in today,

they’ve got all different categories
like their blood work,

which medications they’re on, various
electronic medical records,

and then all sorts of other data points
from a physical exam and paperwork.

Those are sort of individual components

that aren’t really brought together
in a way that we can act on yet.

But we’re getting very close.

We’re getting close thanks to the use

of machine learning,
artificial intelligence,

and really computer vision

to come to those integrated diagnoses,
and integrated decision platforms

a lot more quickly with higher fidelity.

So, again, we’re talking today
about leveraging visual data points

to come to a better conclusion,
to diagnose things like cancer,

heart attacks, strokes,

very important health care
issues for the whole world.

So, again, we’re moving from what we know
as artificial intelligence

into more of a clinical workflow, into what
I like to call augmented intelligence.

That’s really bringing the data points
of artificial intelligence

into an augmented decision platform
with both patients and physicians,

rendering that as not just
its own small piece,

but really a conclusion that
they come to together.

So we’re moving from health care
to augmented health care,

really augmenting the ability to diagnose,
treat and manage.

I think a great way to think about
this is what’s been done with Tesla.

Look at Tesla: they’ve done phenomenally
well in artificial intelligence

and edge mapping, machine learning,
vision mapping, computer vision.

This is actual footage

of Tesla’s autonomous vehicle
in real time in Silicon Valley,

driving around the streets.

Obviously, this isn’t fully autonomous.

This is working in conjunction
with the driver themselves,

and you can see that it maps out
all these data points.

We’re trying to do the same thing
with data points in health care.

So how do they do that?

Lots of really great HD cameras

that are brilliantly integrated
into their vehicles.

So, how can we take some
of those high-definition images

we see in dermatology
on patients’ skin, on human skin,

and really integrate them to do this
with machine learning in health care?

So, what a great research team
up at Stanford did

was take over 100,000
biopsy-proven clinical images

across over 2,000 different
disease classifications

and use them as a training set

for a convolutional neural network.

We won’t get into the details
of the computer science on it,

but after training that
model back and forth,

it ultimately comes to a conclusion
when being quizzed on a new lesion

or a new image, and it
renders a diagnosis,

whether that’s a melanoma,
a dysplastic nevus or whatever it may be.
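
To make that concrete, here is a minimal transfer-learning sketch of that kind of training loop, written in PyTorch. This is not the Stanford team’s actual code: the backbone, the folder layout and the hyperparameters are illustrative assumptions.

```python
# Minimal transfer-learning sketch, NOT the Stanford team's code:
# the ResNet backbone, folder layout and hyperparameters below are
# illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Standard ImageNet-style preprocessing for a pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406],
                         [0.229, 0.224, 0.225]),
])

# Hypothetical folder of biopsy-proven images, one subfolder per diagnosis.
train_set = datasets.ImageFolder("lesion_images/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a network pretrained on everyday images and replace the
# final layer so it predicts one of the skin-disease classes instead.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:       # one pass over the training set
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                 # backpropagation: training "back
    optimizer.step()                # and forth" adjusts the weights
```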

So, again,

they took that now trained artificial
intelligence machine learning model

and they quizzed it on all
sorts of different images

and different kinds of cutaneous findings
that might or might not be cancerous.

This is sort of what the computer sees.

You see this sort of static edge mapping
outline on some of these lesions,

as well as this heat-map rendering of

what would be a concerning finding

versus what would be
a little bit more benign.
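
That heat-map rendering is commonly produced with a class-activation technique such as Grad-CAM; the talk doesn’t say which method this demo uses, so the sketch below is one plausible way to do it, reusing the hypothetical model from the previous sketch.

```python
# Grad-CAM-style heat map: highlight the image regions that drove
# the model's prediction. One plausible method, not necessarily the
# one used in the demo shown in the talk.
import torch
import torch.nn.functional as F

def gradcam_heatmap(model, layer, image, target_class):
    """Return a [0, 1] heat map the size of the input image."""
    acts, grads = [], []
    h1 = layer.register_forward_hook(lambda m, i, o: acts.append(o))
    h2 = layer.register_full_backward_hook(
        lambda m, gi, go: grads.append(go[0]))

    score = model(image.unsqueeze(0))[0, target_class]
    model.zero_grad()
    score.backward()                  # gradients w.r.t. the feature maps
    h1.remove(); h2.remove()

    # Weight each feature map by its average gradient, keep positive
    # evidence, and normalize for display as a heat-map overlay.
    weights = grads[0].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * acts[0]).sum(dim=1))
    cam = cam / (cam.max() + 1e-8)
    return F.interpolate(cam.unsqueeze(0), size=image.shape[1:],
                         mode="bilinear", align_corners=False)[0, 0]

# e.g., with the model from the previous sketch:
# heat = gradcam_heatmap(model, model.layer4, preprocess(img), pred)
```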

And this is a real-time demo

of this happening on a cell phone,

looking at these images
and actually scanning them

with reasonable fidelity to see whether
or not this is something concerning,

like a melanoma

or it’s something less concerning,
like a benign cutaneous growth.

They quizzed that model on carcinomas

against board-certified dermatologists
working together.

They found the algorithm was about
96% accurate in diagnosing

those carcinomas,

and about 94% accurate
in diagnosing melanomas,

which is pretty incredible

that we’re able to do that
at this young stage

in artificial intelligence in health care.
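
For a sense of how such a quiz is scored, the trained model’s top guess can be compared with the biopsy-proven label on held-out images; the test_loader below is an assumed counterpart to the training loader sketched earlier.

```python
# Sketch of scoring the "quiz": compare the model's top guess with
# the biopsy-proven label on held-out images. `test_loader` is a
# hypothetical loader built like the training one.
import torch

model.eval()
correct = total = 0
with torch.no_grad():                         # grading, not training
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)   # top-1 diagnosis
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"diagnostic accuracy: {correct / total:.1%}")
```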

Even more impressive is another group
back over at MIT and Harvard

that did a collaborative study looking,
instead of at those individual lesions,

at full total-body photography,
more of a global picture,

which would be more practical,
like the image at the start,

where you’re actually looking
at the global picture of a patient

and identifying which of those points,

which of those lesions, are
more concerning than the others.

And it’s really the start
of this technology.

I think this is extremely cool.

This can work in anything in health care,

not just dermatology, not just pathology,

it’s really anything that’s
visual at this point.

Hopefully, we can move into something
more like natural language processing

in the near future,

but this is allowing us to augment
dermatopathology as well,

with some of the teams over at Google
doing an incredible job,

really outlining what the concerning
features are under histology,

and even another group over at Stanford
doing this for X-rays as well,

identifying findings, again, directly
on the cell phone,

using these algorithms
to not just render a diagnosis,

but to augment what we can already do

with well-trained, brilliant physicians
and scientists taking care of patients.

So with this significant
increase in technology,

we now have our global patient,
our global human.

We’re all patients at the end of the day,

if we’re lucky enough
to make it that far.

We’ve got different data points,
whether it’s your Apple Watch,

your sleep tracking data,
genomic data, exercise data,

all of these things can come
together in the near future

and render a much higher
fidelity diagnosis.
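
As a purely speculative sketch of what “coming together” could mean in software, those heterogeneous streams might be flattened into one feature vector for a simple risk model; every field name and the model choice here are hypothetical.

```python
# Speculative sketch of fusing wearable, sleep, genomic and exercise
# data into one feature vector; every field name is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

def patient_features(watch, sleep, genomics, exercise):
    """Flatten one patient's data streams into a single vector."""
    return np.concatenate([
        [watch["resting_heart_rate"], watch["hrv_ms"]],  # wearable vitals
        [sleep["avg_hours"], sleep["efficiency"]],       # sleep tracking
        genomics["risk_variant_flags"],                  # genomic markers
        [exercise["weekly_active_minutes"]],             # activity data
    ])

# Given labeled historical records (X = stacked feature vectors,
# y = known outcomes), even a simple linear model yields a combined
# risk score for a new patient:
#   risk_model = LogisticRegression().fit(X, y)
#   p_event = risk_model.predict_proba(x_new.reshape(1, -1))[0, 1]
```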

I don’t think we’re that far off

from having something as simple
as your FaceTime on your phone

or unlocking your phone from the lock screen,

rendering somewhat of a diagnosis
that you may or may not be experiencing

a stroke at this point in time,
based on facial droop,

hemifacial paralysis or things like that,

and that could make the difference
in saving someone’s life

with ubiquitous technology.
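
To make the facial-droop idea concrete, here is a speculative sketch using an off-the-shelf face-landmark detector (MediaPipe) to measure mouth-corner asymmetry; the landmark indices and the threshold are illustrative assumptions, and anything like this would need real clinical validation.

```python
# Speculative facial-droop screen: measure vertical mouth-corner
# asymmetry with MediaPipe FaceMesh. Landmark indices and the 0.04
# threshold are illustrative assumptions; at best this is a prompt
# to seek care, never a diagnosis.
import cv2
import mediapipe as mp

LEFT_MOUTH, RIGHT_MOUTH = 61, 291   # FaceMesh mouth-corner landmarks

def droop_score(image_bgr):
    """Vertical mouth-corner asymmetry, normalized to image height."""
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as mesh:
        result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None                 # no face found in the frame
    pts = result.multi_face_landmarks[0].landmark
    return abs(pts[LEFT_MOUTH].y - pts[RIGHT_MOUTH].y)

score = droop_score(cv2.imread("selfie.jpg"))
if score is not None and score > 0.04:        # hypothetical threshold
    print("asymmetry detected - consider seeking urgent care")
```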

So, I think we’re really
moving from something

that’s currently a granular
pixellated patient picture

to a much more high fidelity,

high resolution health care.

I’m really optimistic
about the future here,

and looking forward to continuing
to build that.

I think, in conclusion,

this is really the arc of innovation,

the arc of innovation in anything,
bringing us back

to the analogy of self-driving vehicles
and vehicles in general.

So, with transportation,
basic transportation is like walking.

Basic health care is like a physical
examination, you look at the patient.

The next thing you develop
over that platform is a tool

like the horse and carriage,
or a stethoscope.

And then on top of that, we’ve got
great technology like cars or MRIs

or genomic sequencing,
and then on top of that,

you have an infrastructure type
development like roads and highways

or in health care,
electronic medical records.

I think the last frontier
is what we’re starting to see

in that autonomous ride-sharing kind
of macro, system-level change

where you’re changing the whole platform,
leveraging existing technologies

to then bring out the best components
of all of the above.

And I think in health care,

we can really leverage
augmented intelligence

to deliver the best possible patient care.

So in conclusion, will your
algorithm see you now?

I don’t think it will
and I don’t think it ever should

because driving cars
autonomously is not the same

as rendering a new cancer diagnosis

and working through that with a family,

with a loved one and really coming to
a shared decision-making platform.

This will never be something
that should be autonomous

but I am optimistic we’re going to live

in a day, not far from now, where
you can walk in front of your mirror

as you get ready in the AM,
brushing your teeth

and that mirror will tell you, “Well, hey,

there’s maybe a 9% chance of rain today.

Oh, by the way, that lesion
on the side of your neck -

possibly an 89% chance that’s melanoma.

Also, thanks to all of your
exercise and sleep physiology data,

you’ve decreased your risk for
a heart attack by about 10%.”

Now, this might seem like
it’s a long way away,

but long shots are never
as long as they seem.

Thank you for your attention.