Wearable tech that helps you navigate by touch
Keith Kirkland
Do you remember your first kiss?
Or that time you burned
the roof of your mouth
on a hot slice of pizza?
What about playing tag
or duck, duck, goose as a child?
These are all instances where
we’re using touch to understand something.
And it’s the basis of haptic design.
“Haptic” means of or relating to
the sense of touch.
And we’ve all been using that
our entire lives.
I was working on my computer
when my friend,
seeing me hunched over typing,
walked over behind me.
She put her left thumb
into the left side of my lower back,
while reaching her right index finger
around to the front of my right shoulder.
Instinctively, I sat up straight.
In one quick and gentle gesture,
she had communicated
how to improve my posture.
The paper I was working on
at that very moment
centered around developing new ways
to teach movement using technology.
I wanted to create a suit
that could teach a person kung fu.
(Laughter)
But I had no idea how
to communicate movement
without an instructor being in the room.
And in that moment,
it became crystal clear: touch.
If I had vibrating motors
where she had placed each of her fingers,
paired with motion-capture data
of my current and optimal posture,
I could simulate the entire experience
without an instructor
needing to be in the room.
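To make that idea concrete, here is a minimal sketch in Python, assuming a motion-capture feed of joint angles and one vibration motor per monitored body location. Every name, number, and the motor interface below is a hypothetical illustration, not something from the talk or an actual product.

```python
# Minimal sketch of the posture idea above: compare motion-capture joint
# angles against a target posture and buzz a motor at the body location that
# needs to move. All names, thresholds, and the motor interface here are
# hypothetical illustrations, not part of any real product.

# Target (optimal) posture and a current reading, as joint angles in degrees.
OPTIMAL_POSTURE = {"lower_back": 175.0, "right_shoulder": 30.0}
current_posture = {"lower_back": 150.0, "right_shoulder": 55.0}

TOLERANCE_DEGREES = 10.0


def trigger_motor(location: str, intensity: float) -> None:
    """Stand-in for driving a real vibration motor (e.g., over Bluetooth)."""
    print(f"buzz {location} at intensity {intensity:.2f}")


def correct_posture(current: dict, optimal: dict) -> None:
    """Buzz each location whose joint angle drifts outside the tolerance."""
    for location, target in optimal.items():
        error = abs(current[location] - target)
        if error > TOLERANCE_DEGREES:
            # Scale intensity with how far off the posture is, capped at 1.0.
            intensity = min(error / 45.0, 1.0)
            trigger_motor(location, intensity)


correct_posture(current_posture, OPTIMAL_POSTURE)
```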
But there was still one important part
of the puzzle that was missing.
If I want you to raise your wrist
two inches off of your lap,
using vibration,
how do I tell you to do that?
Do I put a motor at the top of your wrist,
so you know to lift up?
Or do I put one
at the bottom of your wrist,
so it feels like you’re being pushed up?
There were no readily available answers
because there was no commonly
agreed-upon haptic language
to communicate information with.
So my cofounders and I
set out to create that language.
And the first device we built
was not a kung fu suit.
(Laughter)
But in a way, it was even more impressive
because of its simplicity and usefulness.
We started with the use case
of navigation,
which is a simplified form of movement.
We then created Wayband,
a wrist-wearable device that could
orient a user toward a destination,
using vibrating cues.
We would ask people to spin around
and to stop when they felt
they were facing the right way to go.
Informally, we tried this
with hundreds of people,
and most could figure it out
within about 15 seconds.
It was that intuitive.
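For illustration, here is one plausible way that spin-and-stop behavior could be built, sketched in Python: compute the compass bearing from the wearer to the destination, vibrate more strongly the further their heading is from that bearing, and go quiet once they face roughly the right way. This is an assumption about how such a cue might work, not the actual Wayband algorithm; the coordinates, tolerance, and haptics interface are invented for the example.

```python
# Hypothetical sketch of a heading-based haptic cue: silent when the wearer
# faces the destination, buzzing harder the further off they turn.
import math


def bearing_degrees(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    d_lon = lon2 - lon1
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360.0


def heading_error(heading, bearing):
    """Signed difference between compass heading and target bearing, in [-180, 180)."""
    return (bearing - heading + 180.0) % 360.0 - 180.0


def haptic_cue(heading, user_pos, destination, tolerance=15.0):
    """Return a vibration strength in [0, 1]; 0 means 'you are facing the right way.'"""
    bearing = bearing_degrees(*user_pos, *destination)
    error = abs(heading_error(heading, bearing))
    if error <= tolerance:
        return 0.0                      # aligned with the destination: no buzz
    return min(error / 180.0, 1.0)      # stronger buzz the further off you are


# Example: a wearer spinning in place, checking the cue every 45 degrees.
user = (40.7812, -73.9665)   # made-up start point
goal = (40.7680, -73.9819)   # made-up destination
for heading in range(0, 360, 45):
    print(heading, round(haptic_cue(heading, user, goal), 2))
```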
Initially, we were just trying to get
people out of their phones
and back into the real world.
But the more we experimented,
the more we realized that those
who stood to benefit most from our work
were people who had little or no sight.
When we first approached
a blind organization, they told us,
“Don’t build a blind device.
Build a device that everyone can use
but that’s optimized
for the blind experience.”
We created our company WearWorks
with three guiding principles:
make cool stuff,
create the greatest impact we can
in our lifetimes
and reimagine an entire world
designed for touch.
And on November 5, 2017,
Wayband helped a person who was blind
run the first 15 miles
of the New York City Marathon
without any sighted assistance.
(Applause)
It didn’t get him through the entire race
due to the heavy rain,
but that didn’t matter.
(Laughter)
We had proved the point:
that it was possible to navigate
a complex route using only touch.
So, why touch?
The skin has an innate sensitivity
akin to the eyes’ ability
to recognize millions of colors
or the ears’ ability to recognize
complex pitch and tone.
Yet, as a communications channel,
it’s been largely relegated to
Morse code-like cell phone notifications.
If you were to suddenly receive
a kiss or a punch,
your reaction would be
instinctive and immediate.
Meanwhile, your brain would be playing
catch-up on the back end
to understand the details
of what just occurred.
And compared to instincts,
conscious thought is pretty slow.
But it’s a lightning bolt
compared to the snail’s pace
of language acquisition.
I spent a considerable amount of time
learning Spanish, Japanese,
German and currently Swedish,
with varying degrees of failure.
(Laughter)
But within those failures were kernels
of how different languages are organized.
That gave our team insight
into how to use the linguistic order
of well-established languages
as inspiration for
an entirely new haptic language,
one based purely on touch.
It also showed us when using language
mechanics wasn’t the best way
to deliver information.
In the same way a smile is a smile
across every culture,
what if there was some
underlying mechanism of touch
that transcended linguistic
and cultural boundaries?
A universal language, of sorts.
You see, I could give you
buzz-buzz-buzz, buzz-buzz,
and you would eventually learn
that that particular
vibration means “stop.”
But as haptic designers,
we challenged ourselves.
What would it be like to design “stop”?
Well, based on context,
most of us have the experience
of being in a vehicle
and having that vehicle stop suddenly,
along with our body’s reaction to it.
So if I wanted you to stop,
I could send you
a vibration pattern, sure.
Or, I could design a haptic experience
that just made stopping
feel like it was the right thing to do.
And that takes more than an arbitrary
assignment of haptic cues to meanings.
It takes a deep empathy.
It also takes the ability to distill
human experience into meaningful insights
and then into haptic
gestures and products.
Haptic design is going to expand
the human ability
to sense and respond to our environments,
both physical and virtual.
There’s a new frontier: touch.
And it has the power to change
how we all see the world around us.
Thank you.
(Applause)