Interact with any device using your mind
this is my cousin sam
sam was born completely paralyzed
he’s unable to speak or control his
movements
because of severe cerebral palsy
he’s completely dependent on his parents
and caregivers
for everything while being fully lucid
i want you to imagine what it might be
like
to be completely dependent on others yet
to not be able to express yourself
and to be unable to interact with the
world around you
this is what i was imagining at my first
tedx auckland some years back
under the punchy slogan of "make
happen" we the audience were asked
to complete the sentence "if i ran the
world, something something"
when i was done
my answer card read "allow
paralyzed children to have equal
opportunity in the world"
and the second card underneath said
"create technology that allows them
to speak" at this point
this was a big dream with absolutely
zero support for execution
luckily just days after that event i
came across a video demonstration of a
device
that could read the electrical activity
of the brain
and with some training allowed a person
to move a box on the screen
using their brain activity now it seemed
like there could be a way to get to that
dream
the initial idea went something like
this
you take that device or similar
technology
to then bypass the limitations of
physical disability
and directly access what’s in the
person’s mind
so they can control a computer and
communicate
plus there was a bonus and this might
surprise
some of you everybody has a brain
so if this worked for someone like my
cousin sam
then anybody could also communicate
and interact using their mind
in academia and research this is called
brain computer interfacing
and outside of those more recently it’s
also been referred to as brain sensing
technology
here is a very quick overview the basic
premise is to
observe the activity of the brain and
then interpret it
into control commands or other inputs
for computer systems
these observations can be done either
invasively
by surgically inserting electrodes into
the brain to detect the electrical
activity of neurons the brain cells
or non-invasively using different brain
imaging and monitoring techniques
for example functional mri that uses
huge magnets
in room-sized machines to look at
changes in
blood flow that are associated with
brain activity
or electroencephalography (eeg) that
looks at the electrical activity of the
brain using an array
of sensors that is placed on the outside
of the scalp
to me this seemed like an easier option
compared to brain surgery or
multi-million dollar mri scanners
especially if we could get a less
conspicuous
device
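The observe-interpret-command loop described in this overview can be sketched in a few lines. This is a minimal illustration only, not Thought-Wired's actual pipeline; the feature, threshold, and command names are invented for the sketch:

```python
# minimal sketch of the brain-computer interface loop:
# observe a window of signal samples, extract a simple feature,
# and map it to a control command. thresholds are illustrative.

def extract_feature(window):
    """mean absolute amplitude of one window of samples (microvolts)."""
    return sum(abs(s) for s in window) / len(window)

def to_command(feature, threshold=50.0):
    """map the feature to a command: a large deflection means 'select'."""
    return "select" if feature > threshold else "idle"

def bci_step(window):
    return to_command(extract_feature(window))

# a quiet window vs. a burst of activity (e.g. an intentional blink)
print(bci_step([2.0, -3.0, 1.5, -2.5]))        # prints "idle"
print(bci_step([120.0, -180.0, 150.0, -90.0])) # prints "select"
```

Real systems replace the amplitude feature with learned models, but the observe, interpret, command structure stays the same.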
fast forward a couple of years and i and
several people much smarter than me
started a company called thoughtwired
at thoughtwired we use this kind of
technology to create a non-invasive
commercially available brain computer
interface for everyone
the ultimate goal is to have an
interface that will enable
anyone to intuitively interact with any
device
using their mind
last august years of our research and
development
culminated in launching our first
product nous blinq
nous blinq is a wearable that
translates eye movement signals
into actions while sensing brain
activity
a combination of an advanced biosensor
and our unique machine learning
algorithms
allows anyone to control a computer by
blinking their eyes
with it people like sam can type out
text browse the internet
operate smart home devices basically do
anything that can be done
with a click of a button
one of these people is danielle a vibrant
creative 27 year old
danielle just like sam lives with severe
cerebral palsy
before nous blinq she could only
communicate "yes" by looking up
and "no" by looking down she had tried a
number of
different types of assistive access and
communication technologies but none
worked for her needs during development
of nous we spent
a year co-designing the system with
danielle
and her support network this process
ensured that what we were building
suited her needs and the needs of those
with similar conditions
now danielle uses nous blinq on an almost daily
basis to play games
communicate complete puzzles create
artwork
even take sneaky photos of her dad and
friends while they’re snoozing
it is the first time she’s been able to
independently communicate
in her life those of you
paying attention now may be asking
"you are building a brain interface why
does your first product work using eye
blinks?"
the answer is electricity and
practicality
our bodies produce different kinds of
electrical activity
there’s brain activity muscle
activations even eye movements have
their own signals
and all of these can be detected using
sensors
let me demonstrate
excellent i did bring my brain perfect
the headband that i’m wearing is
actually a biosensor
it is manufactured by our partner and it
is part of
the nous system on the screen
you can see the signals the sensor is
picking up
from my forehead it is the combination
of my brain activity
eye movements and muscle activations
this bar chart is the breakdown of that
activity into frequency bands
and each band correlates to certain
types of brain activity and states
and we’ll come back to that in a little
bit and if you look at the line graph
you will see spikes there you go
like that that is me blinking my eyes
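The bar chart of frequency bands can be approximated in a few lines: split a signal's spectrum into the conventional EEG bands and sum the power in each. A toy sketch with an assumed 128 Hz sampling rate and a synthetic 10 Hz (alpha-range) signal rather than real sensor data:

```python
import math

# break a signal into the classic EEG frequency bands and report
# the power in each. band edges are the conventional ones; the
# signal below is synthetic, not real sensor data.

FS = 128          # sampling rate in Hz (assumed for this demo)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    n = len(signal)
    powers = {name: 0.0 for name in BANDS}
    # naive discrete Fourier transform, fine for a short demo window
    for k in range(1, n // 2):
        freq = k * fs / n
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = (re * re + im * im) / n
        for name, (lo, hi) in BANDS.items():
            if lo <= freq < hi:
                powers[name] += p
    return powers

# one second of a 10 Hz sine wave: energy lands in the alpha band
signal = [math.sin(2 * math.pi * 10 * t / FS) for t in range(FS)]
powers = band_powers(signal)
print(max(powers, key=powers.get))  # prints "alpha"
```

Production systems would use a windowed FFT rather than a naive transform, but the band-summing step is the same idea as the bar chart on screen.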
during our work on translating brain
activity into computer controls
we also created technology that can
detect these eye blinks
and more importantly differentiate
between the
natural automatic blinks and the
intentional purposeful ones
we take the signal data and transform it
into commands for controlling computers
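The natural-versus-intentional distinction can be illustrated with a toy rule: intentional blinks are held longer and produce a clear deflection. The amplitude and duration thresholds here are invented for illustration; a real system would learn them per user:

```python
# toy blink discriminator: natural blinks are short, intentional
# blinks are deliberately held longer. threshold values are invented.

AMPLITUDE_THRESHOLD = 100.0   # microvolts: deflection that counts as a blink
MIN_INTENTIONAL_MS = 400      # blinks held longer than this are intentional

def classify_blink(peak_amplitude, duration_ms):
    if peak_amplitude < AMPLITUDE_THRESHOLD:
        return "no blink"
    if duration_ms >= MIN_INTENTIONAL_MS:
        return "intentional"   # translate into a click / select command
    return "natural"           # ignore: automatic blink

print(classify_blink(150.0, 120))   # quick reflex blink -> "natural"
print(classify_blink(160.0, 600))   # held blink -> "intentional"
print(classify_blink(20.0, 500))    # no real deflection -> "no blink"
```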
this blink detection capability was
ready to be shipped before anything else
that we’ve built and so
we put it to use for people like
danielle
meaningful outcomes for users are far
more important than how capable or cool
your technology is
nous blinq systems have already been
deployed in eight countries around the
world
with individuals like danielle and
organizations that work with people with
disabilities
and as we speak finally my cousin sam
and his parents are learning how to use
their own
nous blinq system
but nous blinq is just the first step
connecting the brain and computers is a
very hard problem
you need large amounts of dynamic brain
data
captured outside of controlled
environments like research labs
so everything that our team is learning
and the signal data that
our technology is accumulating from
nous blinq users
is feeding into our ongoing development
process of that ultimate brain interface
as we work towards that vision we are
guided by the principle that if you
start designing with people who
have very specific urgent needs
then you are likely to solve problems
for much larger populations
in the long run some of the technologies
and products
that are used by billions today
were originally designed to address very
specific challenges
the typewriter was initially used by
people who were blind to write letters
and alexander graham bell developed the
telephone to help his work
supporting people who were deaf
when we empathize with others we create
things that otherwise we might have
never created ourselves we see past the
specifics of what we know
to create experiences that can actually
be universal
by following this process we created a
solution that is already changing
people’s lives and obviously we’re not
done there are thousands of people like
sam and danielle who need
our system so our immediate focus is to
make sure it gets out to them
but at the same time we’re already
thinking how we can help
even larger populations of people we’re
considering a number of applications
in areas like cognitive assessments
therapy and sleep
cognitive assessments rely on the ability to
communicate
which people who need those assessments
often lack due to trauma or disability
today mental health professionals rely
on subjective surveys
and observations to assess whether their
interventions are working
these methods are notoriously biased
uninformative
and inaccurate and in sleep health
people with
chronic disorders have to go to sleep
labs to be wired up to uncomfortable
devices
to be diagnosed and treated
with the type of wearable sensors and
machine learning algorithms that we use
it is possible to observe levels of
attention
stress cognitive load and many others
we can do that by analyzing the
composition of brain signals
those different frequency bands that i
showed earlier
and we already have the technological
foundation to do that
by building on it we will be able to
provide accurate measures of
emotional state and cognitive function
real-time feedback on interventions
and progress to recovery or improvement
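One example of such a composition measure is the theta/beta power ratio, which attention research has linked to sustained attention (a higher ratio tends to accompany lower attention). A minimal sketch, with an invented cutoff value that is not Thought-Wired's actual metric:

```python
# coarse attention label from two band powers, using the theta/beta
# ratio from attention research. the cutoff of 3.0 is invented for
# illustration, not a clinical value.

def attention_flag(theta_power, beta_power, cutoff=3.0):
    """return (label, ratio) from theta and beta band power."""
    ratio = theta_power / beta_power
    label = "low attention" if ratio > cutoff else "engaged"
    return label, round(ratio, 2)

print(attention_flag(theta_power=12.0, beta_power=6.0))  # ratio 2.0 -> engaged
print(attention_flag(theta_power=20.0, beta_power=4.0))  # ratio 5.0 -> low attention
```

In practice such a measure would be calibrated per person and combined with other signals before any feedback is shown to a user.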
i’m especially excited by the
combination of the work that we’ve
already completed with these new
directions it will unlock access to
treatments and services for people with
severe disabilities who struggle to
communicate
alongside all of our other users this talk
started with
imagining and dreaming so let’s wrap it
up in a similar way
imagine having your smart device tell
you hey you seem to be
quite stressed and your attention isn’t
so great at the moment
so maybe let’s take a break go for a
walk and then you can come back to this
important piece of work that you need to
finish we can do this by analyzing those
same kinds of brain data
that we have to analyze for therapeutic
outcomes we will be able to
anticipate and predict users’ mental
states similar to how google maps at
the moment suggests
different destinations based on your
history
then there are specific responses in the
brain that we observe
or experience for example it’s been
demonstrated that
"detected can be distinct signals in
brain why error
is observed" most of you just produced
those signals
while i did my best to keep that sentence
distorted for you
basically our brains spot errors in
language within milliseconds of
observation
you can see how plugging this detection
into a smart device could eliminate the
need to
ever press backspace again
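The auto-backspace idea can be sketched as a loop: after each keystroke, consult a (hypothetical) error-potential detector and undo the character if the brain flagged a mistake. `detect_error_potential` here is a stand-in for a trained classifier over post-keystroke EEG, and the threshold is invented:

```python
# toy version of "never press backspace again": undo a keystroke
# whenever a (hypothetical) error-potential detector fires.

def detect_error_potential(eeg_window):
    # placeholder: a real system would run a trained classifier here.
    # we flag an error when the window shows a large negative deflection.
    return min(eeg_window) < -80.0

def type_with_autocorrect(keys, eeg_windows):
    text = []
    for key, window in zip(keys, eeg_windows):
        text.append(key)
        if detect_error_potential(window):
            text.pop()   # the brain noticed the mistake: auto-backspace
    return "".join(text)

keys = ["c", "a", "r", "t"]
windows = [[-5, 3], [-4, 2], [-10, 6], [-120, 15]]  # error response after "t"
print(type_with_autocorrect(keys, windows))  # prints "car"
```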
from there we enter the territory of
superpowers
researchers around the world have been
achieving incredible results
extracting speech and images from brain
activity
letting people control exoskeletons and
robotic prosthetics
using their mind even proving that
technological telepathy
brain-to-brain communication is possible
these examples are strictly lab results
in most cases achieved with surgically
implanted electrodes
this means that unfortunately none of us
are going to be getting a
mind controlled exoskeleton for this
christmas or that we’ll be
telling campfire stories telepathically
during the holidays
but it gives us a glimpse of
scientifically proven
possibility and it is our job to now
turn these possibilities
into reality we have a working platform
to get us there
and key principles to guide us remember
to start with practicality before
coolness of technology
think of the non-average users because
the average
doesn’t exist and design with people
and not for people
imagine a future where technology
understands
and anticipates your needs and helps you
live
a healthier more fulfilling life
imagine a world that is universally
accessible
to everyone at the speed of thought
this is the world that i and our team
are building and we want to empower not
just people like my cousin sam
or even me and you but everyone
to live healthier happier lives full of
independence possibility and superpowers
thank you