Google tries “pre-bunking” in an effort to counter misinformation

In the days leading up to the 2020 election, social media platforms began experimenting with the idea of “pre-bunking”: pre-emptively debunking misinformation or conspiracy theories by telling people what to watch out for.

Now, researchers say there’s evidence the tactic may work — with help from Homer Simpson and other well-known fictional pop culture characters.

In a study published Wednesday, social scientists from the University of Cambridge and Google reported on experiments in which they showed people 90-second cartoons – in a lab setting and as advertisements on YouTube – explaining in simple, non-partisan language some of the most common manipulation techniques.

The cartoons succeeded in raising awareness of common misinformation tactics such as scapegoating and creating a false choice, at least for a short time, they found.

The study was published in the journal Science Advances and is part of a broad effort by tech companies, academics and news outlets to find new ways to rebuild media literacy, as other approaches, such as traditional fact-checking, have failed to make a dent in online disinformation.

“Words like ‘fact-checking’ themselves become politicized, and that’s a problem, so you have to find a way around that,” said Jon Roozenbeek, lead study author and a postdoctoral researcher at the University of Cambridge’s Social Decision-Making Lab.

The researchers compared the effects to vaccination, “inoculating” people against the harmful effects of conspiracy theories, propaganda or other misinformation. The study involved nearly 30,000 participants.

The latest research was compelling enough for Google to adopt the approach in three European countries – Poland, Slovakia and the Czech Republic – to “pre-bunk” anti-refugee sentiment around people fleeing Ukraine.

The company said it has no plans to run “pre-bunking” videos in the U.S. before the midterm elections this fall, but said it could be an option for future election cycles. Or it’s a cause that an advocacy group, nonprofit or social media influencer could take on and pay for on their own, Google and the researchers said. (The videos are “freely available for anyone to use as they wish,” their YouTube page says.)

To avoid alienating political partisans, the researchers created their five cartoons without using real political or media personalities, choosing instead to illustrate their points with fictional characters.

One cartoon explains the concept of an ad hominem attack, in which the person making an argument is attacked rather than the substance of the argument itself. It features a brief clip from “The Simpsons” to illustrate its point, while other cartoons feature characters from the “Star Wars,” “South Park” and “Family Guy” franchises.

The result is videos that are one part rhetoric lesson, one part pop culture deep cut.

“We can, in a very apolitical way, help people resist manipulation online,” said Beth Goldberg, a research manager at Jigsaw, a Google subsidiary that studies misinformation and other topics. Goldberg is a co-author of the study, and Jigsaw funded both the study and the Ukraine-related media campaign.

Cambridge researchers previously created an online game called “Bad News” to teach people about shady media practices, but it required people to sign up.

The cartoons, however, were shown as advertisements on YouTube and were therefore harder to miss. The cost of the ads was around 5 cents per view. And to measure the effect, the researchers used the same technology that YouTube has in place for corporate ad campaigns.

A day after viewing one of the videos, a random subset of participants were given a one-question quiz to test how well they recognized the manipulation technique featured in the ad. The researchers found that a single video ad increased recognition by about 5% on average.

The researchers acknowledged some limitations. For example, they don’t know how long the “inoculation effect” persists – a question Goldberg said they are currently investigating.

Brendan Nyhan, a government professor at Dartmouth College who was not involved in the study, said the results show inoculation against false claims has potential.

“It advances the state of the art by demonstrating these effects across several preregistered studies, showing that they can be obtained in the field on YouTube and that the effects appear to persist at least briefly after exposure,” he said in an email.

A “pre-bunking” campaign may do little to stem the tide of misinformation from major sources such as far-right influencers on YouTube, said Shannon McGregor, a senior researcher in communications at the University of North Carolina at Chapel Hill, who was also not involved in the study.

“Ultimately, the authors propose that those concerned about misinformation on social media (including YouTube) spend more money on these platforms to serve ads to protect against misinformation. In some respects, this is totally unsatisfying for virtually every stakeholder except the platforms,” she said in an email.

Some attempts to counter misinformation have failed. In 2017, Facebook removed a feature that placed a “disputed” flag on certain news posts, after academic research found the flag could entrench deeply held beliefs.

Interest in “pre-bunking” misinformation has been widespread for some years. Twitter used “pre-bunking” on topics such as ballot security in the days leading up to the 2020 election, while Facebook and Snapchat invested resources in voter education. Other efforts have focused on Covid misinformation.

Meanwhile, YouTube has grown in prominence as a source of political news and partisan warfare.

Roozenbeek said he was optimistic that “pre-bunking” videos could educate social media users about manipulation tactics, although they wouldn’t entirely solve the misinformation problem.

“It’s not the be-all, end-all of what platforms should do, in my opinion, to fight misinformation,” he said.

YouTube, which operates separately from Jigsaw as a division of Google, declined to comment on the study.

Goldberg said “pre-bunking” videos aren’t designed to replace the content moderation programs tech companies have in place to detect and remove posts that violate their rules, but she said content moderation alone is not enough given the volume of misinformation.

“It’s hard to chase down every viral piece of misinformation,” she said.

But with the “pre-bunking” videos, she added: “We don’t have to anticipate what a politician is going to say or what the vaccine misinformation campaign is going to say next week. We just have to say: ‘We know there will always be fear-based campaigns.’”
