By Megan Hogan ’21 and Thomas Plant ’22
This blog post details an example of funding from GRI’s student innovation window. Learn more about how you can pitch ideas.
Oddly enough, our interest in disinformation started not with Russia’s interference in the 2016 presidential election, but with deepfakes: highly realistic, fabricated videos. We explored this research topic through GRI’s Project on International Peace and Security, where Megan conceived of the idea as a fellow in the fall of 2019, and Thomas joined her as an intern the following semester. The resulting white paper (pdf) analyzed the costs and benefits of weaponized deepfake technology. Given our shared experience researching disinformation and a solid team dynamic, we decided to join forces as co-directors of a new student-led research lab on disinformation, DisinfoLab.
In June 2020, we sought innovation funding from the Global Research Institute, which aims to bring students’ bright ideas to fruition. The concept of the lab was well received, but more information and refinement were needed to advance the project. We spent the next month and a half developing the lab: drafting a budget, determining the number and type of deliverables, formulating the team hierarchy, and searching the internet for free disinformation datasets. GRI officially signed off on the lab on August 21, awarding DisinfoLab a grant for the 2020-2021 academic year.
DisinfoLab’s mission is two-fold. First, we aim to improve nationwide media literacy rates. Today, social media and search engine algorithms have emerged as novel ways to share global news. Unfortunately, they also spread disinformation more efficiently, amplifying fake news and conspiracy theories within networks of media-illiterate users. Community guidelines and fact-checkers cannot hope to evaluate every piece of information posted online.
Across its social media pages, DisinfoLab uses catchy graphics and bright colors to clarify disinformation that circulates online.
Unsurprisingly, the validity of both true and false information is debated online today. We see this trend not only with individual accounts, but also with established and respected media and research institutions, whose information increasingly comes under scrutiny and distrust. In this media environment, the decay of trust poses the question, “What can we really believe?” DisinfoLab equips users with the tools and skills necessary to navigate that question. Across our social media network, DisinfoLab teaches users to verify reliable sources, debunk false ones, recognize their own cognitive biases, and, by upholding these principles, positively shape their online social circles.
DisinfoLab’s second goal is to expose false, dangerous, and malicious information online. We dedicate exhaustive coverage to issues of disinformation while contributing our own research to this emerging field. Our analyst team tracked bot data to expose that the “incriminating” emails on Hunter Biden’s hard drive were part of a disinformation campaign. On the day of the January 6th Capitol insurrection, we revealed that bots were amplifying a narrative suggesting that disguised Antifa members started the riot. Most recently, we found that bot accounts amplified Andrew Cuomo’s COVID-19 controversy about the underreporting of nursing home deaths.
How exactly do we track this disinformation? Our analyst team processes and analyzes trends of inauthentic information from two online platforms, Trends24 and BotSentinel. Using a script written by one of our analysts, we aggregate the data into spreadsheets, categorize it according to a team code book, and visualize trends throughout the day. Once the data have been collected and coded, the Social Media Analyst team generates reports on significant trends, which are then turned into posts on the DisinfoLab network.
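To give a sense of what a workflow like this can look like, here is a minimal sketch in Python: it tags trend records with code-book categories via keyword cues and writes them to a spreadsheet-friendly CSV. The code book entries, field names, and bot scores below are invented placeholders for illustration, not the lab’s actual code book or the real Trends24/BotSentinel data formats.

```python
import csv
from dataclasses import dataclass

# Hypothetical code book mapping keyword cues to narrative categories.
# A real code book would be far more detailed and applied by trained coders.
CODE_BOOK = {
    "antifa": "false-flag narrative",
    "nursing home": "COVID-19 controversy",
    "hard drive": "hacked-materials claim",
}

@dataclass
class TrendRecord:
    timestamp: str    # when the trend was captured
    phrase: str       # trending phrase or hashtag
    bot_score: float  # 0-1 estimate that amplification is inauthentic

def categorize(record: TrendRecord) -> str:
    """Assign a code-book category by keyword match; 'uncoded' if none hit."""
    text = record.phrase.lower()
    for cue, category in CODE_BOOK.items():
        if cue in text:
            return category
    return "uncoded"

def aggregate(records, path):
    """Write coded records to a CSV that analysts can open as a spreadsheet."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "phrase", "bot_score", "category"])
        for r in records:
            writer.writerow([r.timestamp, r.phrase, r.bot_score, categorize(r)])

# Example records (fabricated for illustration).
records = [
    TrendRecord("2021-01-06T15:00", "#AntifaDidIt", 0.87),
    TrendRecord("2021-02-15T09:30", "Cuomo nursing home deaths", 0.74),
]
aggregate(records, "coded_trends.csv")
```

The keyword-matching step is a deliberate simplification; in practice, human coders applying a shared code book catch the framing and context that simple string matching misses.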
In the future, we hope to expand DisinfoLab’s reach by increasing the amount of information we make available to the public. In the second half of this semester, DisinfoLab has shifted its energy to individual research, giving analysts the opportunity to write and publish op-eds using the troves of bot data we have collected. Additionally, we plan to reach out to experts in the field and invite them to partake in recorded interviews, which will be made available through our expanding social media network. When our pilot year concludes, we will be ready in the second half of 2021 to push DisinfoLab toward a wider audience, promoting research and conversations about the perilous state of truth and what we can do to uphold it.
To keep up with DisinfoLab, follow along on Twitter and Instagram @disinfo_lab, and on Facebook at DisinfoLabWM.
Each year, GRI students have the opportunity to partner with other students, faculty, staff, and alums to receive start-up funds to pursue their research ideas. The Global Research Institute was founded when students asked tough research questions and worked with faculty mentors to create new knowledge. The next innovation window will open in fall 2021. If you have questions about your idea or the application process, please reach out to Rebecca Latourell, Director of Programs & Outreach.