Bethany S. McGowan, Purdue University
Matthew Hannah, Purdue University
Sofia Babcock, Purdue University
Katelyn Biggs, Purdue University
Lara Chuppe, Purdue University
Christina Galiatsatos, Purdue University
Jannine Huby, Purdue University
Michael Kuczajda, Purdue University
Bennet Miller, Purdue University
Stephanie Perun, Purdue University
Amanda Shie, Purdue University
Alicia Stevance, Purdue University
Andrew Yason, Purdue University
Charlotte Yeung, Purdue University
Dis/misinformation was a major concern in the 2016 U.S. presidential election and has only worsened in recent years. Although domestic actors often spread dis/misinformation, actors abroad can use it to sow confusion and push their agendas to the detriment of American citizens. While this report focuses on actors outside the United States, the methods they use are universal and can be adapted to work against domestic agents. A solid understanding of these methods is the first step in combating foreign dis/misinformation campaigns and creating a new information literacy paradigm.
This report highlights the primary mechanisms of dis/misinformation: multimedia manipulation, bots, astroturfing, and trolling. These forms were selected after thorough research into the common pathways through which dis/misinformation spreads online. Multimedia manipulation covers image, video, and audio dis/misinformation in the form of deepfakes, memes, and out-of-context images. Bots are automated social media accounts, not managed by humans, that often contribute to dis/misinformation campaigns. Astroturfers and trolls use deception to sway media users into joining false grassroots campaigns and post emotionally charged content to provoke a response from users.
This policy report also presents case studies of disinformation in China, Russia, and Iran, outlining common patterns of dis/misinformation specific to these countries. These patterns will allow State Department Watch Officers to identify dis/misinformation from the outlined countries more quickly and accurately. Recommendations are provided for each type of disinformation, including what individuals should look for and how to verify that the information they receive is accurate and comes from a reputable source. An addendum at the end of the paper gathers all of the recommendations in one place so readers do not have to search the paper for a specific recommendation.
This report is intended to aid State Department Watch Officers as they work to identify foreign developments accurately; researchers may also find this information useful in anticipating future developments in foreign dis/misinformation campaigns.
Keywords: Misinformation, Disinformation, Diplomacy, Foreign Affairs, Information Literacy

Date of this Version: 12-2022

Recommended Citation: McGowan, Bethany S.; Hannah, Matthew; Babcock, Sofia; Biggs, Katelyn; Chuppe, Lara; Galiatsatos, Christina; Huby, Jannine; Kuczajda, Michael; Miller, Bennet; Perun, Stephanie; Shie, Amanda; Stevance, Alicia; Yason, Andrew; and Yeung, Charlotte, "Identifying Dis/Misinformation on Social Media: A Policy Report for the Diplomacy Lab Strategies for Identifying Mis/Disinformation Project" (2022). Libraries Faculty and Staff Scholarship and Research. Paper 266.
https://docs.lib.purdue.edu/lib_fsdocs/266