The misinformation epidemic: what we can all do to help stop the spread
“It was a gradual process. I just went there step-by-step,” journalist Craig Idlebrook says about becoming an anti-vaxxer.
“I was always trying to figure out how to live more naturally.”
Now years removed from this period of his life, Craig is able to reflect on what led him down that path.
“It’s sort of like coming out of a fever dream,” he says.
Craig now actively supports the other side of the debate and tries to nudge people to get the vaccine.
While the anti-vaccination movement is currently the most prominent case, it is just one example of what the spread of misinformation can lead to globally.
Melbourne University lecturer Dr Jennifer Beckett, who researches social media and online community management and moderation, says misinformation is “a hard thing to combat”.
Misinformation has been around for as long as the media itself.
And while it’s not new, it has never been a more newsworthy issue than it is right now.
RMIT senior lecturer Dr Dana McKay, whose specialty is information systems, says Covid has had a big impact.
“We’re spending even more time on social media and there’s even more information coming at us,” she says.
Australians spent an average of 1 hour and 46 minutes on social media each day in 2020, according to international creative agency wearesocial.com’s Digital 2021 Australia report.
This was a 10 per cent increase on the time Australians spent on social media in 2019.
“This [information overload] causes people to make bad decisions about information and its veracity,” Dr McKay says.
The feedback loop
But Idlebrook says social media wasn’t a factor in his becoming anti-vaccine.
“I didn’t actually have a lot of access to the internet,” he says. “We were out in the woods and we were trying to live all naturally.”
He did, however, create an environment that mimicked the echo chambers produced by social media algorithms.
“There was sort of a feedback loop with me and my partner … we were isolated and we were feeding off each other’s ideas about that, and so there wasn’t a lot of outside input to stop us,” Idlebrook says.
Craig Idlebrook shares his journey to anti-vax and back. Picture: Tim McGrath, via Zoom.
“We started seeking out publications that fed into this … it just fit perfectly into our counter-culture idea.”
This kind of echo chamber is purposefully created for each user by social media companies such as Meta (formerly Facebook).
Sydney University Associate Professor Fiona Martin, who researches online and convergent media, says social media’s recommendation systems are “designed to be self-reinforcing”.
“You are fed more and more extreme versions of what you’re interested in,” she says.
“When a message is consistent and overwhelming it’s more likely to be believed.”
Tackling misinformation
Psychological studies show humans aren’t very good at identifying misinformation and fake news.
Dr McKay says governments are asking the big social media companies to combat the problem.
How much they’re actually doing, however, is another question.
“Get rid of misinformation and you get rid of one of the major drivers for the eyeballs that fund the advertising that funds the platform,” she says.
Dr Beckett says little has changed.
“Facebook hasn’t really cared terribly much about it except for when it’s going to damage their brand,” she says.
“I don’t really see any kind of genuine, long-term desire to do much about it as long as this misinformation generates profits for those companies.”
Dr Martin, who worked with Facebook on a report about hate speech on the platform, says getting real information was difficult.
“Even when we were being funded by Facebook we couldn’t get access to really solid data,” she says.
“It’s something that is so solidly embedded in the DNA of these companies that I think it’s difficult to expect them to self-regulate, which is why governments are stepping in to regulate around the world.”
With governments stepping in, some measures are being taken, but their effectiveness is uncertain.
One approach is to flag problematic users and ban them from the platform.
Dr Martin likens it to playing whack-a-mole: as soon as one page is removed, another simply takes its place. A banned user can easily create another account, or take the page to another platform.
Another common suggestion is to change the companies’ algorithms.
This presents its own difficulties, Dr Martin says.
“Designing a misinformation algorithm requires human beings to correctly identify misinformation, and we know that we’re not that great at that,” she says.
“What I think is the right approach is an approach based on people.”
The core of the problem
People are at the heart of the misinformation epidemic. People are the ones who spread it and view it, and people are the most effective tool to combat it.
Idlebrook says this begins with understanding that anyone and everyone can be susceptible to misinformation.
There’s no immunity from going down rabbit holes.
“We get so much information at our fingertips, but we have no real wisdom to understand what all that information is and there’s so much of it. I think it can be so easy to get off the beaten path.”
Idlebrook has learned from his own experience that the influence of people you trust can be instrumental in altering your perspective.
“I started being around more people who were starting to gently – gently, and I think this was important – say, ‘are you sure about this decision about vaccinations?’” he says.
“My wife’s family were very gentle, but persistent, in being like ‘I’m kind of concerned about this’. They just did a really good job of walking a really fine line with me.”
And it’s not just close friends or family members who have the ability to get through to people.
While online communities are places where a huge amount of misinformation is spread, they can also be places where people have the power to challenge each other’s thinking.
Dr Beckett says there’s a lot of trust in these communities.
“They have a strong, shared emotional connection, a shared sense of values, a shared sense of purpose that is at the foundation of a community.”
If people in online communities speak up and challenge misinformation in the right way when they see it, it can alter others’ viewpoints.
Dr McKay says this is true in the wider online community.
People are less likely to believe misinformation if they see it challenged in comments or downvoted.
Challenging misinformation in a constructive manner is key, and in Idlebrook’s opinion it is the best approach when trying to educate misinformed people, such as those in the anti-vax community.
“Be persistent and calm in your messaging, be connected with the person you’re trying to reach,” he says.