The Nation: Author of "A Matter of Fact" Jess Berentson-Shaw
On Newshub Nation: Simon Shepherd interviews author of "A Matter of Fact" Jess Berentson-Shaw
• Jess Berentson-Shaw describes misinformation as ‘sticky’. She says once information embeds, it’s difficult to remove it from people. "What we do know is simply negating it, so, for example, myth-busting, isn’t a particularly effective way to remove a well-embedded belief."
• She says the issues of free speech and misinformation are being 'conflated' and the big issue is misinformation. She says Canadian speakers Stefan Molyneux and Lauren Southern have been "very clever at making a free-speech argument for what is essentially a misinformation argument."
• She says if we want
an independent voice in the role of the PM's science advisor
"then possibly we need to talk about an alternative system
– a little bit like the environmental advisor."
Simon Shepherd: The phrase ‘fake news’ might’ve been popularised recently by a certain US president, but the phenomenon is nothing new. We are now exposed to much more information and misinformation than ever before, and it’s getting harder to tell the difference. Researcher Jess Berentson-Shaw’s written a new book on the topic, called "A Matter of Fact". I began by asking her how you develop the skills necessary to distinguish what’s fact and what’s fake.
Jess Berentson-Shaw:
You know, that’s a really good question, because
I think as we have democratised the availability of
information, we haven’t democratised the skills to assess
what information is good information.
Simon
Shepherd: What do you mean by democratise those
skills?
That means that what was previously
available to us to assess whether information was
trustworthy or reliable – we used to do it through
relationships. Like if we went to our doctor, and we trusted
our doctor to have good information. And now, you or I, we
go on the Internet, we Google – it’s impossible for us
to know if the study that we see on vaccinations is a good
quality study or a bad quality study. So while we can get
all the information that we need, we don’t know if it’s
good information or not.
All right. And
you’ve said in your book that this misinformation is
‘sticky’. So we’ve seen it in everything, like
conspiracies about moon landings and vaccinations being
linked to autism. So how can you persuade someone that
something they believe isn’t actually
true?
Well, it’s really hard, and this is
the problem that once misinformation embeds, it’s very
difficult to remove it from people. What we do know is
simply negating it, so, for example, myth-busting, isn’t a
particularly effective way to remove a well-embedded belief.
There is some suggestion that the best way to actually deal
with it is to stop it from getting out there in the first
place, and that’s a really important point to make about
how much misinformation is available.
Look, there were just two points there. So you’re saying if you present someone with the facts, it doesn’t necessarily mean you’re going to sway them.
No, not at
all.
And is that frustrating for a
scientist?
I think for a lot of people who
work with knowledge transference science, it’s really
frustrating. We know what good information is, and we have a
really deeply held belief that it should be used to improve people’s lives, and it’s ignored or,
even worse, bad information – like, for example, with meth
testing – is used to develop policy.
So what
is the trick to presenting facts on a controversial issue
like that?
It’s not simple. One of the
things which we talk about is the importance of being able
to tap into the values that are important to people. What we
know is that logic comes very late in the process. When
people get new information, what they first filter it
through is, ‘Does this matter to me?’ and ‘Does it fit
with my beliefs that I already have?’ So one of the things
that we can do is think about what are the helpful values at
the base of this information. So an example of that is
climate change. Doing something about climate change is a
really important activity for human survival. So we need to talk about that mattering – looking after the climate mattering, and looking after each other mattering.
That’s one way to start engaging with people’s values
before we talk about the facts.
You also talk
about something called ‘pre-bunking’. What is
pre-bunking?
So pre-bunking is this idea
that before people are exposed to bad information, we
actually warn them that they may be exposed to poor
information, say, in the vaccinations space, and about the motivations of people.
Okay. So what about the
fact that… Do you outline the bad information and then say
it’s wrong? Or do you just ignore it altogether?
If you’re pre-bunking, so if you
think that people haven’t been exposed to poor
information, which in this day and age is pretty unlikely given how much information there is and how available it is. But actually, what the research suggests is that it’s better not to engage with bad information at all. So – and
this is a classic communications technique in lots of ways
– create your own story about the good
information.
So in terms of information, the
way the internet works, we end up with information that
reflects our existing values, beliefs and the way we search
for things. So how do you get yourself out of the echo
chamber of your own belief system?
Yeah, and
that is really difficult, isn’t it – the idea that we
need to think about our own bias and slow down. I think that
individual behaviour change in that scenario is quite
difficult for people to do. Asking people to actively step
outside their own bias is tricky. For people with skin in the information game, that should be one of the core skills that we’re taught. For everybody else, I think there are things we need to think about structurally in order to help reduce exposure to misinformation.
Okay. That’s hard to do in
this current environment. The term ‘alternative facts’
was coined by Kellyanne Conway, part of the Trump
Administration. Has Trump’s administration made it easier
for people to dismiss things they don’t like as ‘fake
news’?
I think it’s given it a name.
We’ve always done it, though. Misinformation is not new.
Historically, we’ve been doing this for hundreds, if not
thousands, of years. The digital media tools that we have allow that misinformation to be manipulated more easily and spread more easily. So that’s a real
challenge to us.
There’s no filters.
There’s no relationships based on this. It’s a
free-for-all.
Yeah. And there are no editors
in between, say, me and poor information or no medical
professionals or people who we would traditionally think of as experts. But I think there is a really big question around
who are experts and whether expertise is what we should be
relying on in terms of getting people to believe information
anymore. The old ways of saying ‘I’m an expert, listen
to me’ just don’t apply anymore.
Okay.
Well, speaking of alternative facts, which we’ve just mentioned, the right to free speech has been a big argument at the moment. Is the right to free speech a defence for the deliberate spreading of misinformation?
I
think these two things in New Zealand certainly are perhaps
being conflated. I think the bigger issue, actually, is the spread of misinformation. And misinformation is so powerful – it has an advantage over good information – that we need to be thinking very carefully about what platform we’re giving to misinformation.
I’m thinking about the
recent visit by the alt-right Canadian speakers Stefan
Molyneux and Lauren Southern. They say their views are based on science and they’re putting forward a scientific rationale. So how do you fight that?
Yeah,
and that’s tricky. They’ve been very clever at making a
free-speech argument for what is essentially a
misinformation argument. The views that they hold are both racist and sexist, which we know are based on poor information. They’ve been very clever in manipulating people’s fears that if we don’t listen to that, we are at risk of less free speech.
So what do
you do with people like that? Do you engage them, or do you
ignore them?
I think there’s a couple of
things we can do. There’s a question about how giving them a platform allows that information to be repeated, and that is problematic. What we know is that the more information is repeated, regardless of where it comes from, the easier it is for people to believe it. We’re not
very good at remembering the sources of information. So I
think there are some ethical questions about the platforms
which people like that have. I think that it is important to
address the misinformation, but perhaps head-on – the
research suggests – isn’t as useful, and that we need to
amplify correct stories about things like the value of Maori
culture and the value of equality and fairness between
genders.
But it’s very hard to overcome
ingrained negative perceptions, isn’t
it?
It’s very, very difficult. So you hear
something – people talk about the backfire effect. And
this is when, if we directly challenge people’s incorrect beliefs, they sort of double down on them. It depends
somewhat on the issue that you’re talking about and how
strongly people believe it, but, yeah.
You
mentioned before that maybe we shouldn’t believe the
experts as we used to. How important is it, if I’m being given a message, that I identify with the person who’s giving me that message?
Yeah, really
important actually. It’s perhaps not as simple as we
assume it to be. It’s much more about, ‘Can I see that
you and I share values?’ If I can see that you and I share
values, then I’m much more likely to listen to what you
say and perceive that you have expertise in the
area.
All right. The Prime Minister has a
science advisor who can be terminated at the will of the
Prime Minister. Are they independent enough in their current
role?
I think it depends on what the role of
that science advisor is, and it’s not particularly clear
at the moment. Are they simply about amplifying the benefits
of science, or are they there to provide an independent
voice about what works in research? And I think if we want
an independent voice, then possibly we need to talk about an
alternative system – a little bit like the environmental
advisor. And perhaps like the Children’s Commissioner as
well. So I think because they are not necessarily there to provide clearly independent advice, it is possible that that information might not be listened to, as Peter Gluckman himself said when he talked about the meth testing, where he was pushed back on it.
That’s a
clear example of where he wasn’t listened to. Okay, thank
you very much for your time, Jess
Berentson-Shaw.