The rise of human-computer cooperation (Shyam Sankar)

I'd like to tell you about two games of chess. The first happened in 1997, in which Garry Kasparov, a human, lost to Deep Blue, a machine. To many, this was the dawn of a new era, one where man would be dominated by machine. But here we are, 20 years on, and the greatest change in how we relate to computers is the iPad, not HAL.

The second game was a freestyle chess tournament in 2005, in which man and machine could enter together as partners, rather than adversaries, if they so chose. At first the results were predictable: even a supercomputer was beaten by a grandmaster with a relatively weak laptop. The surprise came at the end. Who won? Not a grandmaster with a supercomputer, but actually two American amateurs using three relatively weak laptops. Their ability to coach and manipulate their computers to deeply explore specific positions effectively counteracted the superior chess knowledge of the grandmasters and the superior computational power of other adversaries. This is an astonishing result: average men and average machines beating the best man and the best machine. And anyway, isn't it supposed to be man versus machine? Instead, it's about cooperation, and the right type of cooperation.

We've been paying a lot of attention to Marvin Minsky's vision for artificial intelligence over the last 50 years. It's a sexy vision, for sure; many have embraced it, and it's become the dominant school of thought in computer science. But as we enter the era of big data, of network systems, of open platforms and embedded technology, I'd like to suggest it's time to reevaluate an alternative vision that was actually developed around the same time. I'm talking about J.C.R. Licklider's human-computer symbiosis, perhaps better termed "intelligence augmentation," I.A.

Licklider was a computer science titan who had a profound effect on the development of technology and the Internet. His vision was to enable man and machine to cooperate in making decisions, controlling complex situations without the inflexible dependence on predetermined programs. Note that word: cooperate. Licklider encourages us not to take a toaster and make it Data from Star Trek, but to take a human and make her more capable.

Humans are so amazing: how we think, our non-linear approaches, our creativity, our iterative hypotheses, all very difficult, if possible at all, for computers to do. Licklider intuitively realized this, contemplating humans setting the goals, formulating the hypotheses, determining the criteria and performing the evaluation. Of course, in other ways humans are so limited. We're terrible at scale, computation and volume. We require high-end talent management to keep the rock band together and playing. Licklider foresaw computers doing all of the routinizable work that was required to prepare the way for insights and decision-making. Silently, without much fanfare, this approach has been compiling victories beyond chess.

Protein folding is a topic that shares the incredible expansiveness of chess: there are more ways of folding a protein than there are atoms in the universe. This is a world-changing problem with huge implications for our ability to understand and treat disease, and for this task, supercomputer-fueled brute force simply isn't enough. Foldit, a game created by computer scientists, illustrates the value of the approach. Non-technical, non-biologist amateurs play a video game in which they visually rearrange the structure of the protein, allowing the computer to manage the atomic forces and interactions and identify structural issues. This approach beat supercomputers 50% of the time and tied 30% of the time. Foldit recently made a notable and major scientific discovery by deciphering the structure of the Mason-Pfizer monkey virus protease, which had eluded determination for over 10 years. It was solved by three players in a matter of days, perhaps the first major scientific advance to come from playing a video game.
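To make that division of labor concrete, here is a minimal, purely illustrative Python sketch of the kind of loop Foldit sets up: the human proposes a rearrangement, and the machine grinds through every pairwise interaction, scores the result and flags structural problems. The coordinates, thresholds and penalty terms below are invented assumptions, not Foldit's actual scoring model.

```python
import math
from itertools import combinations

# Toy model: a "fold" is just a list of 3D coordinates, one per residue.
# The thresholds and penalties are illustrative assumptions only.
CLASH_DISTANCE = 2.0      # residues closer than this count as a steric clash
CONTACT_DISTANCE = 6.0    # residues closer than this earn a small packing reward

def score_fold(coords):
    """Computer's job: check every pairwise interaction and report an
    overall score plus any structural issues it finds."""
    score, clashes = 0.0, []
    for (i, a), (j, b) in combinations(enumerate(coords), 2):
        d = math.dist(a, b)
        if d < CLASH_DISTANCE:
            score -= 10.0            # heavy penalty for residues on top of each other
            clashes.append((i, j))
        elif d < CONTACT_DISTANCE:
            score += 1.0             # mild reward for compact packing
    return score, clashes

# Human's job: propose rearrangements by intuition, look at the flagged
# problems and try again; the two calls below stand in for that interaction.
fold = [(i * 3.0, 0.0, 0.0) for i in range(8)]   # a stretched-out chain
print(score_fold(fold))                           # low score, no clashes
fold[7] = (0.5, 0.5, 0.0)                         # "player" drags a residue inward
print(score_fold(fold))                           # clash with residue 0 gets flagged
```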

Last year, on the site of the Twin Towers, the 9/11 memorial opened. It displays the names of the thousands of victims using a beautiful concept called meaningful adjacency. It places the names next to each other based on their relationships to one another: friends, families, coworkers. When you put it all together, it's quite a computational challenge: 3,500 victims, 1,800 adjacency requests, the importance of the overall physical specifications and the final aesthetics. When first reported by the media, full credit for such a feat was given to an algorithm from the New York City design firm Local Projects. The truth is a bit more nuanced. While an algorithm was used to develop the underlying framework, humans used that framework to design the final result. So in this case, a computer had evaluated millions of possible layouts, managed a complex relational system and kept track of a very large set of measurements and variables, allowing the humans to focus on design and compositional choices.
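As a rough illustration of the machine's half of that work, here is a small Python sketch of scoring candidate layouts against adjacency requests and surfacing the best few for a human designer to choose among. This is not Local Projects' actual algorithm; the names, requests and random-search strategy are made-up assumptions.

```python
import random

names = ["A", "B", "C", "D", "E", "F"]
# Each request says two names should end up next to each other.
adjacency_requests = [("A", "B"), ("B", "C"), ("E", "F")]

def satisfied(layout, requests):
    """Count how many requested pairs sit side by side in this layout."""
    pos = {name: i for i, name in enumerate(layout)}
    return sum(1 for a, b in requests if abs(pos[a] - pos[b]) == 1)

def best_layouts(trials=10_000, keep=3):
    """Machine's job: evaluate thousands of candidate layouts and keep the
    best few; a human then chooses among them on aesthetic grounds."""
    scored = []
    for _ in range(trials):
        layout = names[:]
        random.shuffle(layout)
        scored.append((satisfied(layout, adjacency_requests), layout))
    scored.sort(key=lambda s: -s[0])
    return scored[:keep]

for score, layout in best_layouts():
    print(score, layout)
```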

So the more you look around, the more you see Licklider's vision everywhere: whether it's augmented reality in your iPhone or GPS in your car, human-computer symbiosis is making us more capable. So if you want to improve human-computer symbiosis, what can you do? You can start by designing the human into the process. Instead of thinking about what a computer will do to solve the problem, design the solution around what the human will do as well. When you do this, you'll quickly realize that you spend all of your time on the interface between man and machine, specifically on designing away the friction in the interaction. In fact, this friction is more important than the power of the man or the power of the machine in determining overall capability. That's why two amateurs with a few laptops handily beat a supercomputer and a grandmaster. What Kasparov calls process is a byproduct of friction. The better the process, the less the friction, and minimizing friction turns out to be the decisive variable.

Or take another example: big data. Every interaction we have in the world is recorded by an ever-growing array of sensors: your phone, your credit card, your computer. The result is big data, and it actually presents us with an opportunity to more deeply understand the human condition. The major emphasis of most approaches to big data focuses on "How do I store this data? How do I search this data? How do I process this data?" These are necessary but insufficient questions. The imperative is not to figure out how to compute, but what to compute. How do you impose human intuition on data at this scale? Again, we start by designing the human into the process.

When PayPal was first starting as a business, their biggest challenge was not "How do I send money back and forth online?" It was "How do I do that without being defrauded by organized crime?" Why so challenging? Because while computers can learn to detect and identify fraud based on patterns, they can't learn to do that based on patterns they've never seen before, and organized crime has a lot in common with this audience: brilliant people, relentlessly resourceful, an entrepreneurial spirit, and one huge and important difference: purpose. And so while computers alone can catch all but the cleverest fraudsters, catching the cleverest is the difference between success and failure.
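A minimal sketch of that division of labor, in Python, might look like the following. This is not PayPal's actual fraud system; the rules, fields and thresholds are invented for illustration. The point is that pattern matching runs at machine scale, while anything merely unusual, rather than known-bad, is routed to a human analyst, because novel schemes by definition match no existing pattern.

```python
# Known-bad patterns the machine can apply to every transaction, at scale.
# These rules are hypothetical examples, not real fraud signatures.
KNOWN_BAD_PATTERNS = [
    lambda t: t["amount"] > 5_000 and t["account_age_days"] < 2,
    lambda t: t["country"] != t["card_country"] and t["amount"] > 1_000,
]

def triage(transaction):
    """Computer's job: apply every known pattern. Unexplained outliers go
    to a human review queue, where the analyst hunts for the novel scheme."""
    if any(rule(transaction) for rule in KNOWN_BAD_PATTERNS):
        return "block"                    # matches a pattern we've seen before
    if transaction["amount"] > 2_000:     # crude "unusual but unexplained" flag
        return "human_review"
    return "allow"

print(triage({"amount": 9_000, "account_age_days": 1,
              "country": "US", "card_country": "US"}))   # block
print(triage({"amount": 3_500, "account_age_days": 400,
              "country": "US", "card_country": "US"}))   # human_review
```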

There's a whole class of problems like this, ones with adaptive adversaries. They rarely, if ever, present with a repeatable pattern that's discernible to computers. Instead, there's some inherent component of innovation or disruption, and increasingly these problems are buried in big data. For example, terrorism. Terrorists are always adapting in minor and major ways to new circumstances, and despite what you might see on TV, these adaptations, and the detection of them, are fundamentally human. Computers don't detect novel patterns or new behaviors; humans do. Humans using technology, testing hypotheses, searching for insight by asking machines to do things for them. Osama bin Laden was not caught by artificial intelligence. He was caught by dedicated, resourceful, brilliant people in partnership with various technologies.

As appealing as it might sound, you cannot algorithmically data-mine your way to the answer. There is no "find terrorist" button, and the more data we integrate from a vast variety of sources, across a wide variety of data formats, from very disparate systems, the less effective data mining can be. Instead, people will have to look at data and search for insight, and as Licklider foresaw long ago, the key to great results here is the right type of cooperation, and as Kasparov realized, that means minimizing friction at the interface. Now, this approach makes possible things like combing through all available data from very different sources, identifying key relationships and putting them in one place, something that's been nearly impossible to do before. To some, this has terrifying privacy and civil liberties implications; to others, it foretells an era of greater privacy and civil liberties protections. But privacy and civil liberties are of fundamental importance. That must be acknowledged, and they can't be swept aside, even with the best of intentions.

So let's explore, through a couple of examples, the impact that technologies built to drive human-computer symbiosis have had in recent times. In October 2007, US and coalition forces raided an al Qaeda safe house in the city of Sinjar on the Syrian border of Iraq. They found a treasure trove of documents: 700 biographical sketches of foreign fighters. These foreign fighters had left their families in the Gulf, the Levant and North Africa to join al Qaeda in Iraq. These records were human resource forms; the foreign fighters filled them out as they joined the organization. It turns out that al Qaeda, too, is not without its bureaucracy. They answered questions like "Who recruited you?" "What's your hometown?" "What occupation do you seek?" In that last question, a surprising insight was revealed: the vast majority of foreign fighters were seeking to become suicide bombers for martyrdom. That was hugely important, since between 2003 and 2007 Iraq had 1,382 suicide bombings, a major source of instability.

Analyzing this data was hard. The originals were sheets of paper in Arabic that had to be scanned and translated. The friction in the process did not allow for meaningful results in an operational time frame using humans, PDFs and tenacity alone. The researchers had to lever up their human minds with technology to dive deeper, to explore non-obvious hypotheses, and in fact, insights emerged. Twenty percent of the foreign fighters were from Libya, and 50% of those were from a single town in Libya, hugely important since prior statistics had put that figure at 3%. It also helped to hone in on a figure of rising importance in al Qaeda, Abu Yahya al-Libi, a senior cleric in the Libyan Islamic Fighting Group. In March of 2007, he gave a speech, after which there was a surge in participation among Libyan foreign fighters.
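Once the paper forms were digitized into structured records, that kind of insight becomes a simple aggregation that the machine can repeat across any field the analyst asks about. Here is a Python sketch with entirely invented placeholder records; the actual Sinjar data, tooling and field names are not reproduced here.

```python
from collections import Counter

# Invented toy records standing in for the digitized recruitment forms.
fighters = [
    {"nationality": "Libya", "hometown": "town_a", "occupation_sought": "suicide bomber"},
    {"nationality": "Libya", "hometown": "town_a", "occupation_sought": "fighter"},
    {"nationality": "Saudi Arabia", "hometown": "city_x", "occupation_sought": "suicide bomber"},
    {"nationality": "Libya", "hometown": "town_b", "occupation_sought": "suicide bomber"},
    {"nationality": "Morocco", "hometown": "city_y", "occupation_sought": "fighter"},
]

# Which countries dominate the flow, which single town dominates within one
# country, and how many recruits are seeking martyrdom: each is one line.
by_nationality = Counter(f["nationality"] for f in fighters)
libyan_towns = Counter(f["hometown"] for f in fighters if f["nationality"] == "Libya")
roles = Counter(f["occupation_sought"] for f in fighters)

print(by_nationality.most_common())
print(libyan_towns.most_common())
print(roles.most_common())
```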

Perhaps most clever of all, though, and least obvious: by flipping the data on its head, the researchers were able to deeply explore the coordination networks in Syria that were ultimately responsible for receiving and transporting the foreign fighters to the border. These were networks of mercenaries, not ideologues, who were in the coordination business for profit. For example, they charged Saudi foreign fighters substantially more than Libyans, money that would have otherwise gone to al Qaeda. Perhaps the adversary would disrupt their own network if they knew they were cheating would-be jihadists.

In January 2010, a devastating 7.0 earthquake struck Haiti, the third deadliest earthquake of all time. It left one million people, 10% of the population, homeless. One seemingly small aspect of the overall relief effort became increasingly important as the delivery of food and water started rolling. January and February are the dry months in Haiti, yet many of the camps had developed standing water. The only institution with detailed knowledge of Haiti's floodplains had been leveled in the earthquake, leadership inside. So the question was: which camps are at risk, how many people are in these camps, what's the timeline for flooding, and given very limited resources and infrastructure, how do we prioritize the relocation?

The data was incredibly disparate. The US Army had detailed knowledge for only a small section of the country. There was data online from a 2006 environmental risk conference, and other geospatial data, none of it integrated. The human goal here was to identify camps for relocation based on priority need. The computer had to integrate a vast amount of geospatial information, social media data and relief organization information to answer this question. By implementing a superior process, what was otherwise a task for 40 people over three months became a simple job for three people in 40 hours, all victories for human-computer symbiosis.
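The integration-and-prioritization step can be sketched very simply once the layers are joined. The Python below uses entirely invented data and an invented scoring rule; the real effort combined Army surveys, the 2006 conference dataset, social media and relief organization reports. The humans set the goal and the criteria; the machine does the joining and the bookkeeping.

```python
# Invented illustrative layers: flood risk per zone (geospatial data),
# camp populations (relief organization reports), and which zone each camp is in.
flood_risk_by_zone = {"zone1": 0.9, "zone2": 0.4, "zone3": 0.1}
camp_population = {"campA": 12_000, "campB": 3_000, "campC": 8_000}
camp_zone = {"campA": "zone1", "campB": "zone3", "campC": "zone2"}

def priority(camp):
    """Expected number of people at flood risk in this camp: the machine's
    join of population data with the flood-risk layer."""
    return camp_population[camp] * flood_risk_by_zone[camp_zone[camp]]

# Rank camps for relocation, highest expected exposure first.
ranked = sorted(camp_population, key=priority, reverse=True)
for camp in ranked:
    print(camp, round(priority(camp)))
# campA comes first: a large population sitting in the highest-risk floodplain.
```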

We're more than 50 years into Licklider's vision for the future, and the data suggests that we should be quite excited about tackling this century's hardest problems, man and machine in cooperation together. Thank you.