The mission to create a searchable database of Earth’s surface
Will Marshall

Four years ago, here at TED,

I announced Planet’s Mission 1:

to launch a fleet of satellites

that would image
the entire Earth, every day,

and to democratize access to it.

The problem we were trying
to solve was simple.

Satellite imagery you find online is old,
typically years old,

yet human activity was happening
on timescales of days, weeks and months,

and you can’t fix what you can’t see.

We wanted to give people the tools
to see that change and take action.

The beautiful Blue Marble image,
taken by the Apollo 17 astronauts in 1972,

had helped humanity become aware
that we’re on a fragile planet.

And we wanted to take it
to the next level,

to give people the tools
to take action, to take care of it.

Well, after many
Apollo projects of our own,

launching the largest fleet
of satellites in human history,

we have reached our target.

Today, Planet images
the entire Earth, every single day.

Mission accomplished.

(Applause)

Thank you.

It’s taken 21 rocket launches –

this animation makes it look
really simple – it was not.

And we now have
over 200 satellites in orbit,

downlinking their data to 31 ground
stations we built around the planet.

In total, we get 1.5 million 29-megapixel
images of the Earth down each day.

And on any one location
of the Earth’s surface,

we now have on average
more than 500 images.

A deep stack of data,
documenting immense change.

And lots of people are using this imagery.

Agricultural companies are using it
to improve farmers' crop yields.

Consumer-mapping companies are using it
to improve the maps you find online.

Governments are using it
for border security

or helping with disaster response
after floods and fires and earthquakes.

And lots of NGOs are using it.

So, for tracking
and stopping deforestation.

Or helping to find the refugees
fleeing Myanmar.

Or to track all the activities
in the ongoing crisis in Syria,

holding all sides accountable.

And today, I’m pleased
to announce Planet Stories.

Anyone can go online to planet.com,

open an account and see
all of our imagery online.

It’s a bit like Google Earth,
except it’s up-to-date imagery,

and you can see back through time.

You can compare any two days

and see the dramatic changes
that happen around our planet.

Or you can create a time lapse
through the 500 images that we have

and see that change
dramatically over time.

And you can share these over social media.

It’s pretty cool.

(Applause)

Thank you.
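To make the time-lapse idea concrete, here is a minimal sketch, assuming you have already downloaded a dated stack of images for one location (filenames like 2018-04-01.png); it simply stitches them into an animated GIF and is not Planet’s own tooling.

```python
# Minimal time-lapse sketch: stitch a folder of dated scene images into a GIF.
# Assumes filenames sort chronologically (e.g. 2018-04-01.png, 2018-04-02.png).
from pathlib import Path

import imageio.v2 as imageio  # pip install imageio

frames = [imageio.imread(path) for path in sorted(Path("scenes").glob("*.png"))]
imageio.mimsave("timelapse.gif", frames, duration=0.2)  # 0.2 s per frame
```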

We initially created this tool
for news journalists,

who wanted to get unbiased information
about world events.

But now we’ve opened it up
for anyone to use,

for nonprofit or personal uses.

And we hope it will give people the tools
to find and see the changes on the planet

and take action.

OK, so humanity now has this database
of information about the planet,

changing over time.

What’s our next mission, what’s Mission 2?

In short, it’s space plus AI.

What we’re doing
with artificial intelligence

is finding the objects
in all the satellite images.

The same AI tools that are used
to find cats in videos online

can also be used to find
information in our pictures.

So, imagine if you could say,
this is a ship, this is a tree,

this is a car, this is a road,
this is a building, this is a truck.

And if you could do that
for all of the millions of images

coming down per day,

then you basically create a database

of all the sizable objects
on the planet, every day.

And that database is searchable.

So that’s exactly what we’re doing.
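As a rough sketch of that idea (not Planet’s actual pipeline), one could run an object detector over each day’s scenes and write whatever it finds into a small searchable table; `detect_objects` below is a hypothetical stand-in for any detection model.

```python
# Conceptual sketch: index detected objects per scene into a searchable table.
import sqlite3

def detect_objects(image_path):
    """Hypothetical stand-in for a detection model.
    Should return (label, lat, lon, confidence) tuples; plug in a real model here."""
    return []

db = sqlite3.connect("earth_objects.db")
db.execute("""CREATE TABLE IF NOT EXISTS objects
              (label TEXT, lat REAL, lon REAL, confidence REAL, date TEXT)""")

def index_scene(image_path, date):
    rows = [(label, lat, lon, conf, date)
            for label, lat, lon, conf in detect_objects(image_path)]
    db.executemany("INSERT INTO objects VALUES (?, ?, ?, ?, ?)", rows)
    db.commit()

# Once indexed, the stack becomes searchable, e.g.:
# SELECT COUNT(*) FROM objects WHERE label = 'plane' AND date = '2018-04-24';
```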

Here’s a prototype working on our API.

This is Beijing.

So, imagine if we wanted
to count the planes in the airport.

We select the airport,

and it finds the planes in today’s image,

and finds the planes
in the whole stack of images before it,

and then outputs this graph of all
the planes in Beijing airport over time.

Of course, you could do this
for all the airports around the world.

And let’s look here
in the port of Vancouver.

So, we would do the same,
but this time we would look for vessels.

So, we zoom in on Vancouver,
we select the area,

and we search for ships.

And it outputs where all the ships are.

Now, imagine how useful this would be
to people in coast guards

who are trying to track
and stop illegal fishing.

See, legal fishing vessels

transmit their locations
using AIS beacons.

But we frequently find ships
that are not doing that.

The pictures don’t lie.

And so, coast guards could use that

and go and find
those illegal fishing vessels.
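As a conceptual sketch of that cross-check (not a real coast-guard tool), one could flag any detected ship that has no AIS report nearby; the distance test here is a crude degree threshold, and all positions are illustrative.

```python
# Sketch: flag ship detections with no nearby AIS report ("dark" vessels).
def dark_vessels(detected_ships, ais_reports, max_offset_deg=0.01):
    """Return detections with no AIS report within ~1 km (roughly 0.01 degrees)."""
    flagged = []
    for lat, lon in detected_ships:
        nearby = any(abs(lat - a_lat) < max_offset_deg and
                     abs(lon - a_lon) < max_offset_deg
                     for a_lat, a_lon in ais_reports)
        if not nearby:
            flagged.append((lat, lon))
    return flagged

# Example: two detections, one matching an AIS beacon, one "dark".
print(dark_vessels([(49.30, -123.10), (49.35, -123.20)],
                   [(49.301, -123.101)]))
# -> [(49.35, -123.2)]
```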

And soon we’ll add
not just ships and planes

but all the other objects,

and we can output data feeds

of the locations
of all these objects over time

that can be integrated digitally
into people’s workflows.
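For illustration only, one record in such a feed might look like the GeoJSON feature below; the field names are assumptions, not Planet’s actual schema, but GeoJSON drops easily into most mapping workflows.

```python
# Illustrative feed record as a GeoJSON Feature (field names are assumptions).
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-123.10, 49.30]},  # lon, lat
    "properties": {
        "object_type": "ship",
        "observed_at": "2018-04-24T18:32:00Z",
        "confidence": 0.93,
    },
}
```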

In time, we could get
more sophisticated browsers

that pull in data
from different sources.

But ultimately, I can imagine us
abstracting out the imagery entirely

and just having a queryable
interface to the Earth.

Imagine if we could just ask,

“Hey, how many houses
are there in Pakistan?

Give me a plot of that versus time.”

“How many trees are there in the Amazon

and can you tell me the locations
of the trees that have been felled

between this week and last week?”

Wouldn’t that be great?

Well, that’s what
we’re trying to go towards,

and we call it “Queryable Earth.”
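As a hedged illustration, those questions map naturally onto queries over the hypothetical `objects` table sketched earlier (assuming it also carried a country attribute); the SQL below is conceptual, not Planet’s interface.

```python
# Conceptual "Queryable Earth" questions expressed as SQL over the
# hypothetical objects table (assumes a country column exists).
houses_in_pakistan_over_time = """
    SELECT date, COUNT(*) FROM objects
    WHERE label = 'house' AND country = 'Pakistan'
    GROUP BY date ORDER BY date
"""
# Trees present last week but missing this week ~ locations of felled trees.
felled_trees_last_week = """
    SELECT lat, lon FROM objects WHERE label = 'tree' AND date = :last_week
    EXCEPT
    SELECT lat, lon FROM objects WHERE label = 'tree' AND date = :this_week
"""
```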

So, Planet’s Mission 1 was
to image the whole planet every day

and make it accessible.

Planet’s Mission 2 is to index
all the objects on the planet over time

and make it queryable.

Let me leave you with an analogy.

Google indexed what’s on the internet
and made it searchable.

Well, we’re indexing what’s on the Earth
and making it searchable.

Thank you very much.

(Applause)