The Power and Pitfalls of the Mind
Transcriber: Nguyen Phuong Nga
Reviewer: Rhonda Jacobs
So the theme that we’ve been asked
to speak about for these TED talks
is the bigger question.
Now for me when I think of
the bigger question in life,
what normally comes to mind is
how do we know when we’ve got
the right conclusion, perspective or idea?
Now, when you really think about it,
there’s so many brilliant people,
brilliant minds,
who are often on completely
different ends of the spectrum
when it comes to their conclusions
on certain topics.
So then how can we be sure
that we’re on the right side
and that we’re seeing things
in the accurate way?
And then when you take that idea further
and you think about
where our thoughts come from
and how so many of our thoughts
and actions are autonomous or automatic,
how can we then say that our life
is a product of free will
or is it simply determinism?
And so these are some of the ideas
I’d love to explore in the coming minutes.
Now, for me, growing up,
it was a bit of a unique circumstance.
I was raised exclusively by women.
My parents were separated,
and so it was my three sisters
and nine female cousins,
all of whom were very strong women.
Now, as any youngest child knows,
your older siblings
are always so influential over you.
And for me it was no different.
I was surrounded by sisters
who were very strong minded
and each of them
had their own perspectives.
Add to that the fact that I had a mother
from a different culture -
she moved from Pakistan to the UK.
And then obviously the culture
that I grew up in was in the UK.
So I had a lot of ideas and I guess
contrasting viewpoints of the world,
which I was trying to make sense of
and reconcile from a young age.
What my sisters also introduced me to
was a whole host of brilliant books
and within there psychology
was always one of my favorite topics
to read and research.
Now, one of my favorite books
is called Thinking, Fast and Slow.
It’s by an author called Daniel Kahneman.
And in this book
he talks about the idea of
two systems of thinking
that work within our mind.
Now, System 1 is more of the autonomous
or the automatic thinking
that’s taking place all the time.
It really has its roots
in our early, prehistoric evolutionary mind,
and it’s really there
because it allows us to be instinctual.
So if you can imagine a scenario
where you see a predator of some type
on the horizon,
the ability to immediately recognize
a threat and to know to run away
or take some kind of action,
that would be your System 1 thinking.
It’s the autonomous part,
the automatic part,
and it requires very little energy.
It’s working all the time.
It’s there to give you
instincts and feelings.
Now, when these instincts and feelings
are handed over to System 2,
and System 2 confirms them,
that’s when they become a belief.
So your System 2
is more of your conscious brain.
It’s where you take in
more complex information,
review it,
and consciously think
through more complex scenarios.
So System 2 is a lot slower,
and it’s more taxing,
but the feedback it gives
is likely to be more accurate.
Now, most of the time System 1
and System 2 work together.
And they work very well.
However, System 1,
because it works fast,
is prone to what we call
cognitive biases.
And this is the part of the mind
that we could look at as the pitfalls.
Now, when we think about cognitive bias,
essentially a cognitive bias
is our mind, in a sense,
having a blind spot in its thinking,
almost lying to itself -
it’s an illusion created within
our thinking that we’re not aware of.
And that’s really where the danger is.
It’s because we’re not aware
of these cognitive biases
that it limits our perspectives,
reasonings and conclusions.
Now, there’s a whole host
of cognitive biases that I could go into.
But let’s focus on perhaps
what’s one of the most common ones
which is confirmation bias.
Now, we see confirmation bias
happening all the time.
Essentially when confirmation
bias is taking place,
the scenario is that you have a belief,
and as you go to look at the evidence
or you hear the evidence
or the information,
your mind will automatically ignore
the things which contradict
the belief that you already hold,
and only pick up on
what confirms the belief.
So even if the belief you have is untrue,
and you’re presented with the information,
your mind might simply
ignore that information
and only pick up on what is going
to confirm that idea.
Now, I’m sure we can all remember
having a conversation with someone
where no matter how much
information we put across,
it just didn’t seem to resonate.
And that’s confirmation bias at play.
Now, where confirmation bias, I think,
becomes even more exaggerated
is in this digital age
that we live in right now,
where different social media sites
and places like YouTube
work on algorithms that will feed you
more of what you’re looking for.
So essentially, you’ve got confirmation
bias working in your mind,
and then you’ve got these algorithms
that are feeding you more of the same
of what you’re already looking at.
And to me, the two things combined
will just create an echo chamber
where you’re really just being exposed
to one side of a situation repeatedly.
And you can imagine the dangers with that.
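That feedback loop - confirmation bias in the user, plus a recommender that serves more of whatever gets clicked - can be sketched as a toy simulation. To be clear, every number here (the click probabilities, the 5% feed adjustment) is an illustrative assumption for the sketch, not a real platform parameter:

```python
import random

def run_feed(steps=1000, bias=0.9, seed=0):
    """Toy echo-chamber model: a user with confirmation bias
    interacts with a feed that recommends more of what they click."""
    rng = random.Random(seed)
    # The feed starts balanced: half its items agree with the user's belief.
    share_agreeing = 0.5
    for _ in range(steps):
        item_agrees = rng.random() < share_agreeing
        # Confirmation bias: items agreeing with the existing belief are
        # clicked with probability `bias`; disagreeing items rarely are.
        click_prob = bias if item_agrees else 1 - bias
        if rng.random() < click_prob:
            # The algorithm nudges the feed toward whatever was clicked.
            target = 1.0 if item_agrees else 0.0
            share_agreeing += 0.05 * (target - share_agreeing)
    return share_agreeing

print(run_feed())  # final share of agreeing items in the feed
```

Even though the feed starts perfectly balanced, the small per-click nudge compounds: agreeing items get clicked more, so more of them get served, so they get clicked even more - and the feed drifts toward one side.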
Now, if we look further into this idea
of confirmation bias
and how it relates to
how we relate to one another,
there’s an interesting experiment
that was done in 1920
by Edward Thorndike,
an American psychologist.
He looked at how officers
perceived their soldiers,
and he discovered
what he called “the halo effect.”
So essentially the halo effect,
what it means is that
the officer’s first impression
of the soldier would hold
no matter how the soldiers were to behave
after that first impression.
So if the first impression was good,
even if the soldiers were generally
not behaving in the best way afterwards,
the officer would still
have a good impression of them,
and vice versa, if the first impression
was bad and so on.
So firstly, that really supports the idea
that you don’t get a second chance
to make a first impression.
But more so it just alerts us to this idea
of the fallibility of our mind
and how we’re able to form
these confirmation biases
even if they aren’t based in reality.
That becomes even worse
when you think about it
in terms of stereotyping.
So when we’re meeting people,
often what happens
when people form stereotypes is this:
if the person in front of you
shows you nine traits out of 10
that go against the stereotype,
but one trait
that is in line with it,
many of our minds might
still only pick up on that one thing
and reinforce the stereotype.
So you can see how
this could be dangerous
in how we relate to one another.
But where I think
it becomes really harrowing,
as far as cognitive bias is concerned,
is when we look at historical examples
like 1940s Germany.
To me, this is the worst example of how
cognitive biases can lead us astray.
So what made ordinary Germans
commit such horrific war crimes
as what we saw in the Second World War?
Now, from the research, it would appear
that two main factors stand out.
One is that they surrendered their own
thinking to that of a leadership figure.
Again, this is another
cognitive human bias.
And the second one is conformity bias,
which is the desire
to want to fit into the group.
A lot of research points to these elements
as an explanation for that behavior.
One of the studies in particular
was looking at 210 members
of the Reserve Police Battalion 101.
And what it found
was that 90 percent of them
had committed murder
by the end of the war.
Now, these were the same people
who prior to the war
had said that the same acts
would have been disgusting to them.
They never would have imagined
doing such a thing.
So psychopathology was ruled out.
And also simply blaming the fear
of not acting was also ruled out
because in many cases they weren’t ordered
to commit a lot of these murders.
So it really points to this idea
of a desire to want to fit into the group
and not wanting to be ostracized
or isolated as a driver
for such horrific behavior.
So really what this says to me
is that it’s not enough
to simply intend to be a good person
or to have good intentions.
But I think to be a good person,
you have to have an understanding
of your own mind,
self-awareness and critical thinking
so that you’re able to separate yourself
from your conditioning,
your fears and your cognitive biases.
So other than having an understanding
of what these biases are
and then reflecting on them to be able
to minimize their impact on your life,
another way to overcome them
is to utilize mental models.
Now, mental models is an idea
that was brought to the forefront
largely by Charlie Munger,
the investment partner of Warren Buffett.
He spoke about this quite a lot
in a number of speeches,
but mental models themselves
are essentially used
by a lot of us in everyday life
without us necessarily
calling them mental models.
Essentially a mental model is a rule
or a framework for your thinking.
And it’s powerful for many reasons.
It helps you to have complex thought
in a much faster way,
but it also helps you to overcome
certain cognitive biases.
For example, Jeff Bezos
speaks about using
a regret minimization framework
when he was trying to decide
whether to leave his job
and start Amazon.
And essentially what he did was
imagine himself as an
80- or 90-year-old version of himself,
looking back at that moment
of his life and trying to decide
if he would regret having made
that decision or not,
and that essentially freed him up
to make the decision.
And of course, we know where that led.
I’ve definitely applied
a similar approach,
and I think what I also do
a lot is reframe a situation.
When I was starting my business,
I was well aware
that because of situations
like COVID last year
and the 2008 financial crisis,
there’s a lot of aspects of life
that are outside of our control.
So you could put together the perfect
business plan, do everything right,
and still fail
through no fault of your own.
Now, knowing that, I reframed
my leap of faith to start my business
by simply saying,
well, if I take that leap,
knowing how difficult it can be,
and if I genuinely believe
that I’ve applied myself
in the best way I possibly could,
that for me is success in and of itself.
And that reframing of a situation
essentially freed me up
to not have that anxiety and to go forward
and do it with peace of mind.
So mental models can work
in many different circumstances
and can help us to overcome
these shortfalls in our thinking.
First principles is another great
mental model, used by Elon Musk.
Essentially, it’s an approach of taking
a system, breaking it
down into components,
and challenging every assumption
within that system
whilst maintaining the key principles.
Now, when you rebuild that
by looking at how every one
of those things could be better,
you essentially end up with hopefully
a more efficient or a better system.
And again, Elon Musk,
as successful as he is,
has quoted this idea, this mental model,
several times in his approach.
I definitely try to use the same thing
in what I do as well.
But I think where mental models
become the most powerful
is when you build what’s called
a latticework of mental models.
It’s when you take
several mental models
and apply them
together to get objectivity,
to get a whole,
holistic view of a situation.
Now, as an example, let’s take a forest.
If I was to speak to an ecologist
or a botanist and say,
“Give me your perspective on the forest,”
they’d probably notice the quality
of the leaves, the soil, the plants.
If I speak to an environmentalist,
they’d probably notice the impact
of the forest on the environment.
If I speak to a business person,
they might notice its value
as an appreciating asset.
Now, none of these people would be wrong,
but each one of them would be seeing it
through their subjective lens,
as we all do.
But by applying all of these models
to the viewpoint of the forest,
that’s when you get a whole idea
of how to manage the forest
and how to get a truly objective sense
of that particular environment.
And I think this is where
mental models can really help you
to avoid that one-sided view -
the echo chamber effect,
as we said earlier,
that the digital world is creating.
And it really becomes powerful
in giving you a truthful, objective,
well-rounded sense of any situation.
And so ultimately,
what I’m trying to say in this
conversation with you is that
the mind is a really powerful thing.
I think it has incredible potential.
It also has its pitfalls.
I think knowledge of that
can be a very positive thing.
For me personally,
the most positive aspect of that
is this: knowing that we all have
these tendencies to make mistakes
in how we interpret the world
without realizing it
means that when I encounter somebody
who has a vastly different
opinion to my own,
or even somebody who might have
done wrong by me in some scenario,
I find it very difficult
to hate that person,
because I don’t view that as entirely
conscious, intentional behavior
or thought process.
And I think there’s something there
that we can all potentially apply
to how we relate to one another.
I think if you take a look
at social media these days,
there’s so much venom and animosity
toward anybody who has
a viewpoint that disagrees with yours.
But I think when we look at it
as the power and the pitfall of the mind,
we can only have greater compassion.
And also we can leverage the same ideas
to separate our true selves
from our conditioning,
our fears and our biases.
And I think once we’re able to do that,
that’s when we can truly say
that we’re living a life of free will
and not determinism.
Thank you very much for listening.
I will provide all of the resources,
everything I’ve been reading,
that led up to this
conversation or this speech,
and hopefully look forward
to hearing from you.
And I hope you enjoyed listening to this.
Thank you very much.