Inside the bizarre world of internet trolls and propagandists
Andrew Marantz
I spent the past three years
talking to some of the worst
people on the internet.
Now, if you’ve been online recently,
you may have noticed that there’s
a lot of toxic garbage out there:
racist memes, misogynist propaganda,
viral misinformation.
So I wanted to know
who was making this stuff.
I wanted to understand
how they were spreading it.
Ultimately, I wanted to know
what kind of impact
it might be having on our society.
So in 2016, I started tracing
some of these memes back to their source,
back to the people who were making them
or who were making them go viral.
I’d approach those people and say,
“Hey, I’m a journalist.
Can I come watch you do what you do?”
Now, often the response would be,
“Why in hell would I want to talk to
some low-t soy-boy
Brooklyn globalist Jew cuck
who’s in cahoots with the Democrat Party?”
(Laughter)
To which my response would be,
“Look, man, that’s only 57 percent true.”
(Laughter)
But often I got the opposite response.
“Yeah, sure, come on by.”
So that’s how I ended up
in the living room
of a social media propagandist
in Southern California.
He was a married white guy
in his late 30s.
He had a table in front of him
with a mug of coffee,
a laptop for tweeting,
a phone for texting
and an iPad for livestreaming
to Periscope and YouTube.
That was it.
And yet, with those tools,
he was able to propel his fringe,
noxious talking points
into the heart of
the American conversation.
For example, one of the days I was there,
a bomb had just exploded in New York,
and the guy accused of planting the bomb
had a Muslim-sounding name.
Now, to the propagandist in California,
this seemed like an opportunity,
because one of the things he wanted
was for the US to cut off
almost all immigration,
especially from Muslim-majority countries.
So he started livestreaming,
getting his followers
worked up into a frenzy
about how the open borders agenda
was going to kill us all
and asking them to tweet about this,
and use specific hashtags,
trying to get those hashtags trending.
And tweet they did –
hundreds and hundreds of tweets,
a lot of them featuring
images like this one.
So that’s George Soros.
He’s a Hungarian billionaire
and philanthropist,
and in the minds
of some conspiracists online,
George Soros is like
a globalist bogeyman,
one of a few elites who are secretly
manipulating all of global affairs.
Now, just to pause here:
if this idea sounds familiar to you,
that there are a few elites
who control the world
and a lot of them happen to be rich Jews,
that’s because it is one of the most
anti-Semitic tropes in existence.
I should also mention that the guy
in New York who planted that bomb,
he was an American citizen.
So whatever else was going on there,
immigration was not the main issue.
And the propagandist in California,
he understood all this.
He was a well-read guy.
He was actually a lawyer.
He knew the underlying facts,
but he also knew that facts
do not drive conversation online.
What drives conversation online
is emotion.
See, the original premise of social media
was that it was going
to bring us all together,
make the world more open
and tolerant and fair …
And it did some of that.
But the social media algorithms
have never been built
to distinguish between
what’s true or false,
what’s good or bad for society,
what’s prosocial and what’s antisocial.
That’s just not what those algorithms do.
A lot of what they do
is measure engagement:
clicks, comments, shares,
retweets, that kind of thing.
And if you want your content
to get engagement,
it has to spark emotion,
specifically, what behavioral scientists
call “high-arousal emotion.”
Now, “high arousal” doesn’t only
mean sexual arousal,
although it’s the internet,
obviously that works.
It means anything, positive or negative,
that gets people’s hearts pumping.
So I would sit with these propagandists,
not just the guy in California,
but dozens of them,
and I would watch as they did this
again and again successfully,
not because they were Russian hackers,
not because they were tech prodigies,
not because they had
unique political insights –
just because they understood
how social media worked,
and they were willing
to exploit it to their advantage.
Now, at first I was able to tell myself
this was a fringe phenomenon,
something that was
relegated to the internet.
But there’s really no separation anymore
between the internet and everything else.
This is an ad that ran
on multiple TV stations
during the 2018 congressional elections,
alleging with very little evidence
that one of the candidates
was in the pocket of
international manipulator George Soros,
who is awkwardly photoshopped here
next to stacks of cash.
This is a tweet from
the President of the United States,
alleging, again with no evidence,
that American politics is being
manipulated by George Soros.
This stuff that once seemed so shocking
and marginal and, frankly, just ignorable,
it’s now so normalized
that we hardly even notice it.
So I spent about
three years in this world.
I talked to a lot of people.
Some of them seemed to have
no core beliefs at all.
They just seemed to be betting,
perfectly rationally,
that if they wanted
to make some money online
or get some attention online,
they should just be
as outrageous as possible.
But I talked to other people
who were true ideologues.
And to be clear, their ideology
was not traditional conservatism.
These were people who wanted
to revoke female suffrage.
These were people who wanted
to go back to racial segregation.
Some of them wanted to do away
with democracy altogether.
Now, obviously these people
were not born believing these things.
They didn’t pick them up
in elementary school.
A lot of them, before they went
down some internet rabbit hole,
they had been libertarian
or they had been socialist
or they had been something else entirely.
So what was going on?
Well, I can’t generalize about every case,
but a lot of the people I spoke to,
they seem to have a combination
of a high IQ and a low EQ.
They seem to take comfort
in anonymous, online spaces
rather than connecting in the real world.
So often they would retreat
to these message boards
or these subreddits,
where their worst impulses
would be magnified.
They might start out saying
something just as a sick joke,
and then they would get so much
positive reinforcement for that joke,
so many meaningless
“internet points,” as they called it,
that they might start
believing their own joke.
I talked a lot with one young woman
who grew up in New Jersey,
and then after high school,
she moved to a new place
and suddenly she just felt
alienated and cut off
and started retreating into her phone.
She found some of these
spaces on the internet
where people would post
the most shocking, heinous things.
And she found this stuff
really off-putting
but also kind of engrossing,
kind of like she couldn’t
look away from it.
She started interacting with people
in these online spaces,
and they made her feel smart,
they made her feel validated.
She started feeling a sense of community,
started wondering if maybe
some of these shocking memes
might actually contain a kernel of truth.
A few months later, she was in a car
with some of her new internet friends
headed to Charlottesville, Virginia,
to march with torches
in the name of the white race.
She’d gone, in a few months,
from Obama supporter
to fully radicalized white supremacist.
Now, in her particular case,
she actually was able to find her way
out of the cult of white supremacy.
But a lot of the people
I spoke to were not.
And just to be clear:
I was never so convinced
that I had to find common ground
with every single person I spoke to
that I was willing to say,
“You know what, man,
you’re a fascist propagandist, I’m not,
whatever, let’s just hug it out,
all our differences will melt away.”
No, absolutely not.
But I did become convinced that we cannot
just look away from this stuff.
We have to try to understand it,
because only by understanding it
can we even start to inoculate
ourselves against it.
In my three years in this world,
I got a few nasty phone calls,
even some threats,
but it was a fraction of what
female journalists get on this beat.
And yeah, I am Jewish,
although, weirdly, a lot of the Nazis
couldn’t tell I was Jewish,
which I frankly just found
kind of disappointing.
(Laughter)
Seriously, like, your whole job
is being a professional anti-Semite.
Nothing about me
is tipping you off at all?
Nothing?
(Laughter)
This is not a secret.
My name is Andrew Marantz,
I write for “The New Yorker,”
my personality type
is like if a Seinfeld episode
was taped at the Park Slope Food Coop.
Nothing?
(Laughter)
Anyway, look – ultimately,
it would be nice
if there were, like, a simple formula:
smartphone plus alienated kid
equals 12 percent chance of Nazi.
It’s obviously not that simple.
And in my writing,
I’m much more comfortable
being descriptive, not prescriptive.
But this is TED,
so let’s get practical.
I want to share a few suggestions
of things that citizens
of the internet like you and I
might be able to do to make things
a little bit less toxic.
So the first one is to be a smart skeptic.
So, I think there are
two kinds of skepticism.
And I don’t want to drown you in technical
epistemological information here,
but I call them smart and dumb skepticism.
So, smart skepticism:
thinking for yourself,
questioning every claim,
demanding evidence –
great, that’s real skepticism.
Dumb skepticism:
it sounds like skepticism,
but it’s actually closer
to knee-jerk contrarianism.
Everyone says the earth is round,
you say it’s flat.
Everyone says racism is bad,
you say, “I dunno,
I’m skeptical about that.”
I cannot tell you how many young white men
I have spoken to in the last few years
who have said,
“You know, the media, my teachers,
they’re all trying to indoctrinate me
into believing in male privilege
and white privilege,
but I don’t know about that,
man, I don’t think so.”
Guys – contrarian
white teens of the world –
look:
if you are being a round-earth skeptic
and a male-privilege skeptic
and a “racism is bad” skeptic,
you’re not being a skeptic,
you’re being a jerk.
(Applause)
It’s great to be independent-minded,
we all should be independent-minded,
but just be smart about it.
So this next one is about free speech.
You will hear smart, accomplished people
who will say, “Well, I’m pro-free speech,”
and they say it as though
that settles the debate,
when actually, that is the very beginning
of any meaningful conversation.
All the interesting stuff
happens after that point.
OK, you’re pro-free speech.
What does that mean?
Does it mean that David Duke
and Richard Spencer
need to have active Twitter accounts?
Does it mean that anyone
can harass anyone else online
for any reason?
You know, I looked through
the entire list of TED speakers this year.
I didn’t find a single
round-earth skeptic.
Is that a violation of free speech norms?
Look, we’re all pro-free speech,
it’s wonderful to be pro-free speech,
but if that’s all you know
how to say again and again,
you’re standing in the way
of a more productive conversation.
The next suggestion: making decency cool again, so …
Great!
(Applause)
Yeah. I don’t even need to explain it.
So in my research, I would go
to Reddit or YouTube or Facebook,
and I would search for “sharia law”
or I would search for “the Holocaust,”
and you might be able to guess
what the algorithms showed me, right?
“Is sharia law sweeping
across the United States?”
“Did the Holocaust really happen?”
Dumb skepticism.
So we’ve ended up in this
bizarre dynamic online,
where some people see bigoted propaganda
as being edgy or being dangerous and cool,
and people see basic truth
and human decency as pearl-clutching
or virtue-signaling or just boring.
And the social media algorithms,
whether intentionally or not,
they have incentivized this,
because bigoted propaganda
is great for engagement.
Everyone clicks on it,
everyone comments on it,
whether they love it or they hate it.
So the number one thing
that has to happen here
is social networks need
to fix their platforms.
(Applause)
So if you’re listening to my voice
and you work at a social media company
or you invest in one
or, I don’t know, own one,
this tip is for you.
If you have been optimizing
for maximum emotional engagement
and maximum emotional engagement turns out
to be actively harming the world,
it’s time to optimize for something else.
(Applause)
But in addition to putting pressure
on them to do that
and waiting for them
and hoping that they’ll do that,
there’s some stuff that
the rest of us can do, too.
So, we can create or suggest
some better pathways
for angsty teens to go down.
If you see something that you think
is really creative and thoughtful
and you want to share that thing,
you can share that thing,
even if it’s not flooding you
with high arousal emotion.
Now that is a very small step, I realize,
but in the aggregate,
this stuff does matter,
because these algorithms,
as powerful as they are,
they are taking their
behavioral cues from us.
So let me leave you with this.
You know, a few years ago
it was really fashionable
to say that the internet
was a revolutionary tool
that was going to bring us all together.
It’s now more fashionable to say
that the internet is a huge,
irredeemable dumpster fire.
Neither caricature is really true.
We know the internet
is just too vast and complex
to be all good or all bad.
And the danger with
these ways of thinking,
whether it’s the utopian view
that the internet will inevitably save us
or the dystopian view that it
will inevitably destroy us,
is that either way, we’re letting
ourselves off the hook.
There is nothing inevitable
about our future.
The internet is made of people.
People make decisions
at social media companies.
People make hashtags trend or not trend.
People make societies progress or regress.
When we internalize that fact,
we can stop waiting
for some inevitable future to arrive
and actually get to work now.
You know, we’ve all been taught
that the arc of the moral universe is long
but that it bends toward justice.
Maybe.
Maybe it will.
But that has always been an aspiration.
It is not a guarantee.
The arc doesn’t bend itself.
It’s not bent inevitably
by some mysterious force.
The real truth,
which is scarier and also more liberating,
is that we bend it.
Thank you.
(Applause)