By Rebecca Rayburn-Reeves

An Era of Misinformation – The Infodemic

False news shared unintentionally or intentionally (referred to as misinformation and disinformation, respectively) has continued to rise in prevalence. This has become an especially pressing issue around the U.S. electoral process, where it has disrupted voting behavior and reduced trust in American democracy. In 2022, the U.S. House of Representatives Committee on Oversight and Reform published a report on election-based disinformation, stating that “…the greatest current threat to democratic legitimacy now comes from lies by domestic actors who seek to convince Americans that their election systems are fraudulent, corrupt, or insecure.”

Indeed, in a 2020 Quinnipiac poll, 38% of voters believed that there was widespread fraud in the 2020 presidential election. Meanwhile, polls of U.S. adults from the last few decades reveal steadily decreasing trust in government, along with an increasing belief in various kinds of misinformation, such as claims of election fraud. Moreover, a majority of U.S. adults believe that misinformation has caused a great deal of confusion about basic facts related to current events. Unsurprisingly, around one in four Americans have admitted to sharing misinformation online at some point, whether or not they knew it at the time. As the World Health Organization (WHO) has put it, we are now in an infodemic: a period characterized by an overabundance of information, a large portion of which is mis- and disinformation.

In Search of a Cure

There have been a handful of attempts to combat the deleterious effects of misinformation on the public.  

Some successful research efforts have involved intervening right at the moment of exposure: message alerts attached to incoming news that ask readers to consider the accuracy of the information they are reading (called accuracy nudges), as well as systematic attempts to debunk existing and pervasive misinformation. But perhaps one of the most promising types of intervention comes in the form of games that teach players about the strategies used to spread misinformation, through a process called attitude inoculation.

In the same way that a vaccine introduces the body to a weakened form of a virus or pathogen, we can introduce people to weakened or exaggerated forms of misinformation. In attitude inoculation, these weakened or exaggerated examples are presented alongside an explanation of the manipulative strategy used within the content, and the brain develops cognitive antibodies (or associations) that help it detect that strategy the next time it appears.

You can think of attitude inoculation like the spoiler effect in movies. If I tell you who the killer is and how they will go about killing other people, watching that exact script unfold afterward really mutes the effect. In the same way, attitude inoculation warns people that manipulative strategies exist and explains how they are designed to make misinformation go viral. In doing so, attitude inoculation (particularly when delivered through gamification) offers a form of cognitive protection against future manipulation attempts.

Online Games Can Help Build Resistance to Misinformation & Disinformation

Several existing games aim to deliver attitude inoculation, targeting misinformation about topics ranging from COVID-19 to elections and climate change.

One of the first of these games, Bad News, has the player take on the role of a mis- or disinformation spreader. Throughout the game, players learn how to use seven different strategies, or tactics, to sow discord and spread mis/disinformation in a fictitious online environment, all while trying to recruit as many followers as possible and build a certain level of credibility.

For example, players learn to use conspiratorial language to spread misinformation online, which might include outlandish claims about powerful groups or individuals trying to cover up some fact that they don’t want the public to know.

The Bad News game, along with a handful of similar games (e.g., GoViral, Factitious, Harmony Square), has been studied both in the lab and in the field. The results are positive and consistent across games: after playing, people show an improved ability to spot manipulation techniques in social media posts, as well as a reduced self-reported willingness to share manipulative content.

It turns out that people don’t actually like being manipulated. Teaching people about these techniques therefore helps them become better at identifying them in real-world news content while also making them less likely to share that content.

In short, online games that teach players about the strategies used by mis- and disinformation spreaders can help build resistance to manipulative news content. More hopefully still, that resistance extends beyond recognition to behavior: much of the research shows a reduced likelihood of sharing inaccurate or manipulative content after gameplay.

A New Game for the Books – Politricks Helps People Spot Manipulative Election-Based Misinformation

Recently, researchers created a new inoculation game designed to combat election-based misinformation in the United States. The Center for Advanced Hindsight at Duke University has launched its game, Politricks, which has been shown to improve people’s ability to discern manipulative from non-manipulative news content about the electoral process.

In this game, players enter a game show where they are greeted by the host, Eve Dense, and are then challenged across three rounds to learn from and then defeat three different contestants, each of whom embodies one common manipulation technique used in the spread of election-based misinformation. The three characters are aptly named Ann McDotal (the emotional storyteller), Cam E. Leon (the conformist who teaches the bandwagon fallacy), and Tim Foylhat (the conspiracy theorist).

The game is now being tested and can be played for free here.

In the end, misinformation will likely never be eliminated from our discourse, but we can better equip ourselves with tools to spot it when it crosses our paths. Combining these individual, grassroots efforts with larger-scale efforts by major news organizations and journalists to curb the distribution of misinformation offers one path forward through the noise.

This article was edited by Shaye-Ann McDonald.

Rebecca Rayburn-Reeves
Rebecca Rayburn-Reeves received a Master’s in Psychology from UNCW in 2007 and a PhD in Experimental Psychology from the University of Kentucky in 2011, and completed a two-year postdoctoral fellowship at Tufts University in 2016. Her research has involved working with both human and non-human animals, studying the limits of, and changes in, short- and long-term memory, behavioral flexibility, number and time discrimination, social influence, and decision-making under uncertainty.