Aligning Research with Global Issues: How DisinfoLab Evolves with Pressing Challenges
By Adriana Shi ’24
What began as an idea from DisinfoLab analysts grew into four published articles on Diplomatic Courier. The pieces highlight the researchers’ report on how different factors in information environments interact to make populations more or less vulnerable to disinformation. Co-Founder Tom Plant ’22, Co-Directors Aaraj Vij ’23 and Jeremy Swack ’24, and Managing Editor Alyssa Nekritz ’23 began spearheading the project in spring 2022, after Russia’s invasion of Ukraine.
“Russia and the Kremlin have justified their invasion of Ukraine under this false pretense of denazification, and they’ve been very, very proactive in spreading disinformation to support that narrative throughout — not just Russia, but also its surrounding regions of Eastern Europe,” Vij said. “We wanted to look at Eastern European countries that were being impacted by this disinformation and try to understand which ones were more vulnerable to it and which ones were less vulnerable. The idea was that if we can analyze these different countries in Eastern Europe both qualitatively and quantitatively, we can develop policy recommendations.”
To do this, the analysts used Facebook’s API to assess resilience to COVID-19 and Russia-Ukraine disinformation across linguistic groups in Poland, Hungary, and Estonia. Specifically, they analyzed how individual commenters responded to and interacted with posts containing disinformation.
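The article does not describe the lab’s actual collection pipeline, but a minimal sketch of how comments on a public post might be pulled through Facebook’s Graph API gives a sense of the raw material the analysts worked with. The API version string, post ID, and access token below are placeholders, and the field selection is illustrative:

```python
# Illustrative sketch only (not the lab's pipeline): fetch comments on a
# public Facebook post via the Graph API's /comments edge and keep the
# fields a later qualitative or AI pass would need.
import requests

GRAPH_URL = "https://graph.facebook.com/v18.0"  # API version is illustrative
POST_ID = "PAGE-ID_POST-ID"                     # placeholder post identifier
ACCESS_TOKEN = "YOUR-ACCESS-TOKEN"              # placeholder token

def fetch_comments(post_id: str, token: str) -> list[dict]:
    """Page through all comments on a post, keeping text and basic metadata."""
    url = f"{GRAPH_URL}/{post_id}/comments"
    params = {
        "access_token": token,
        "fields": "message,like_count,created_time",
        "limit": 100,
    }
    comments = []
    while url:
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        comments.extend(data.get("data", []))
        # Graph API pagination: follow the "next" link until it is absent.
        url = data.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries the query parameters
    return comments

if __name__ == "__main__":
    for comment in fetch_comments(POST_ID, ACCESS_TOKEN)[:5]:
        print(comment.get("created_time"), "-", comment.get("message", "")[:80])
```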
“One thing that’s just inherent about disinformation research is that a lot of it requires a lot of data scraping, but at some point someone has to make a qualitative determination about a comment,” Swack said. “That’s something we looked at through an artificial intelligence lens, but at the end of the day, how well can AI do a task like that? So something we had our qualitative analysts do was look at these different comments and make determinations. That’s both a difficult process and a lengthy one.”
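As one illustration of the kind of automated first pass Swack describes, an off-the-shelf zero-shot classifier can propose a label for each comment that a human analyst then confirms or overrides. This is not the lab’s actual setup; the model choice and candidate labels below are assumptions made purely for illustration:

```python
# Illustrative sketch only: an automated first pass over scraped comments,
# with final determinations still left to human analysts as described above.
from transformers import pipeline

# Off-the-shelf zero-shot classifier; model and labels are illustrative.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

CANDIDATE_LABELS = [
    "agrees with the disinformation",
    "pushes back on the disinformation",
    "unrelated or unclear",
]

def propose_label(comment_text: str) -> tuple[str, float]:
    """Return the top candidate label and its score for one comment."""
    result = classifier(comment_text, candidate_labels=CANDIDATE_LABELS)
    return result["labels"][0], result["scores"][0]

if __name__ == "__main__":
    label, score = propose_label("This proves the Western media has been lying all along.")
    print(f"{label} ({score:.2f}) -> route low-confidence cases to human review")
```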
The lab’s qualitative analysts also conducted in-depth case studies of the information environments in the three target countries. This included gathering information about media literacy rates within each country, the popularity of social media platforms, and the historical impact of disinformation.
With this background research and Facebook data, DisinfoLab found that Russian speakers and Hungarians were most susceptible to pro-Russian disinformation, while Poles were susceptible to COVID-19 disinformation.
“This finding is important because it highlights the role of history and politics in determining which disinformation narratives will likely impact a country’s population,” Plant said. “Hungary’s government has close ties to Russia. Poland has historical trauma with the Kremlin. These differences made a huge impact on the population at large. Therefore, when crafting solutions to combat mis/disinformation, a ‘one-size-fits-all’ approach is bound to leave some countries more vulnerable than others.”
The conclusion of the report offers recommendations for NATO states seeking to combat disinformation in Eastern Europe, such as crafting “approaches around a country’s distinct media environment, considering country-specific factors like state censorship, local history, and linguistic population.”
The impact of the research is evident in the nearly 3,000 downloads the “Facebook Falsehoods” report has accumulated worldwide, which Nekritz said is the highest total the lab has seen. Moreover, the project highlights the lab’s evolution since its founding in 2020, when researchers primarily focused on media literacy.
“I think that it made sense at that moment — COVID-19 was just getting started, so disinformation was ramped up and the 2020 election was going on,” Vij said. “There were a lot of events where I think that DisinfoLab’s role in producing media literacy and fact checking was valuable. After our pilot year, we realized that … the work that we were doing was a niche that was being satisfied by other fact-checking organizations and news organizations. What we got from the ‘Facebook Falsehoods’ report was not so much ‘what lessons do people need to learn?’ but more broadly, ‘what policies can governments put in place or what initiatives can NGOs take to ensure that people that are trying to help actually are cultivating these media literacy skills?’”
As the lab has evolved, it has grown in size and expanded opportunities for collaboration. This fall, DisinfoLab collaborated with WMGIC and NATO headquarters to develop a disinformation policy hackathon held at the end of October, and the lab now has the capacity to take on small-group and individual projects.
“What set us apart as an undergraduate lab is our capacity to produce original research in an environment where we have access to a number of experts, and also both qualitative and technical prowess to contribute to the field through multidisciplinary ways,” said Vij. “And I think that we’ve really realized the power of that this past year.”
Read the full report.
Read the four published articles on Diplomatic Courier:
- “What Facebook Comments Say About Disinformation in Eastern Europe”
- “Political and Historical Factors of Disinformation Resilience”
- “Large Language Model AI Can Lead Efforts to Combat Disinformation”
- “Facebook Falsehoods: Evaluating Disinformation Resilience in Eastern Europe”