The Pandemic of Misinformation
The adage that a lie can travel halfway around the world while the truth is still putting on its shoes surely resonates today, as misinformation (false information spread without deliberate intent to deceive) and disinformation (false information spread deliberately) grow exponentially. Misinformation is hard to undo and can exacerbate community polarization, creating highly charged situations around politicized issues. In an article in Monitor on Psychology from the American Psychological Association, Stephan Lewandowsky, a psychology professor at the University of Bristol in the UK, is quoted: “The fundamental problem with misinformation is that once people have heard it, they tend to believe and act on it, even after it’s been corrected.” He added, “Even in the best of all possible worlds, correcting misinformation is not an easy task.”
The Psychology Behind Misinformation
Norbert Schwarz, a professor of psychology and marketing at the University of Southern California, suggests that people use five criteria to judge the credibility of new information: compatibility with other known information, credibility of the source, whether others believe it, whether the information is internally consistent, and whether there is supporting evidence. His research also reveals that “people are more likely to accept misinformation as fact if it is easy to read or hear.”
Others have built on Schwarz’s work, studying individual traits that may make people more susceptible to misinformation. Psychologist Peter Ditto of the University of California, Irvine, found that “people deploy skepticism selectively”; for instance, they are less critical of ideas that align with their political beliefs.
The pandemic has created conditions that have fueled the rise of misinformation and disinformation to such a degree that the World Health Organization has declared “a parallel infodemic.” In 2020, psychologist Daniel Romer, research director of the University of Pennsylvania’s Annenberg Public Policy Center, conducted a study that examined the spread of conspiracy theories around the coronavirus, finding that “15% of study respondents believed the pharmaceutical industry created the coronavirus and more than 28% thought it was a bioweapon made by the Chinese government.” He also uncovered that acceptance of these theories correlated with a decreased willingness to wear a mask or take a vaccine. This may help explain, in part, why some school communities have seen such heated debates about masks and vaccines for students and faculty members.
Tools and Techniques of Misinformation and Disinformation
Misinformation has also been spreading about schools’ diversity, equity, and inclusion initiatives, raising questions, concerns, and fear in many communities. While open dialogue is essential in every school community, in many instances the originators of much of the information circulating about schools’ programs are not affiliated with the schools at all. Tools and techniques available today allow any actor to quickly spread misinformation through media channels and platforms, leaving consumers of that information hard-pressed to know what is true.
Data & Society, a nonprofit that studies the social implications of data-centric technologies and automation, is attempting to educate the public on these techniques with the goal of helping people more effectively gather credible information. Data & Society highlights four techniques that fuel the spread of misinformation and disinformation:
- Source hacking: a technique for feeding false information to journalists, investigators, and the general public during breaking-news events or across coverage of highly polarized issues to obscure the authorship of false claims. This is accomplished in four different ways:
  - Viral sloganeering: repackaging reactionary talking points for social media and press amplification. Because these are easily transmitted and copied, they can quickly spread to public forums, both online and off, and thus become far removed from the group that created them.
  - Leak forgery: sharing forged documents to incite news coverage about a hot-button topic.
  - Evidence collages: compiling information from multiple sources into a single, shareable document, usually as an image.
  - Keyword squatting: using keywords and sockpuppet accounts (online identities used for purposes of deception) to misrepresent groups or individuals.
- Data craft: manipulating a system to assert power over it. Bad actors typically manipulate platform usernames, profile handles, bio fields, post dates, follower counts, and other metadata to spread disinformation and influence public discourse, often on a highly politicized topic. For example, manipulators can generate clicks and fake engagement through “astroturfing” (masking the sponsors of a message or organization to make it appear to originate from and be supported by grassroots participants) and “botnets,” which in turn can generate more reshares, likes, and engagement.
- Deep-fakes and cheap-fakes: videos that have been altered through some form of machine learning to “hybridize or generate human bodies and faces,” or other types of audiovisual (AV) manipulation. Data & Society offers a spectrum diagram outlining the ways these techniques are used to deceive.
- Data voids: capitalizing on missing data, the logic of search engines, and the practices of searchers to drive attention to a range of problematic content. These techniques are increasingly being adopted by networks of people invested in polarizing society.
Resources for Help
Given the complexity and growth of these techniques, how does a school leader keep the community engaged in dialogue grounded in fact? It certainly starts with a commitment to ongoing communication and transparency, but more help is needed. Thankfully, there are organizations hard at work developing resources that can help schools understand sources of misinformation, improve communication with their communities, and educate their students in media literacy.
- The News Literacy Project, a nonpartisan educational nonprofit, provides programs and resources for educators and the public to help them become active consumers of news and information and equal and engaged participants in a democracy. They offer resources specifically for educators, including a free newsletter, an online community, professional development, an app to practice and reinforce news literacy skills, and a platform that helps students in grades 6–12 easily identify misinformation.
- First Draft, a nonpartisan organization, strives to protect communities from harmful misinformation by empowering people with the knowledge, understanding, and tools needed to outsmart false and misleading information. They provide research, resources, and training to help communities battle misinformation. One particularly helpful resource is a three-part series on the psychology of misinformation.
- FactCheck.org, a project of the Annenberg Public Policy Center, is dedicated to serving as a nonpartisan, nonprofit “consumer advocate” for voters that aims to reduce the level of deception and confusion in U.S. politics. They monitor the factual accuracy of what is said by major U.S. political players in TV ads, debates, speeches, interviews, and news releases. You can search their website by person or topic to investigate misinformation.
- AllSides for Schools, a nonprofit program, gives educators tools, resources, information, and curricular guidance to help students build skills in news literacy, bias awareness, critical thinking, and conversation across difference. They offer classroom activities, lesson plans, and Mismatch, an online conversation platform that offers students practice in engaging in civil dialogue.
- The Reality Team includes communications professionals, technologists, and cybersecurity and disinformation analysts from the private and public sectors who are committed to pushing back against disinformation. Their content is entirely independent and nonpartisan, focused on dispelling myths, rumors, and disinformation. Among their resources is a guide to conducting a fact-check in 30 seconds.

There always has been, and always will be, disagreement in school communities about practices and programs. Educators welcome and encourage dialogue so that they can build communities of understanding and support for all. This is at the heart of a healthy and thriving community. We must engage in that dialogue, though, with the knowledge that disinformation is rife and that the motivations of those spreading falsehoods are not always clear. We must investigate the sources of the information we rely on and teach media literacy within our school communities as we seek a collective path forward.