On the morning of February 25, 2022, less than 24 hours after Russian forces crossed into Ukraine, a message began spreading across social media platforms worldwide: President Volodymyr Zelenskyy had fled Kyiv. The claim was false. Zelenskyy refuted it by filming himself standing in the capital. But by the time the debunking spread, the disinformation had already reached millions of users across three continents. The second war, the information war, had begun. This wasn’t improvisation. It was doctrine: Russian military theorists have long argued that the psychological dimensions of conflict are inseparable from the physical. What changed in 2022 was the infrastructure: social media platforms, algorithmic amplification, and a global ecosystem of automated accounts gave Moscow tools of narrative manipulation with no historical precedent. Understanding how these tools work is no longer optional for students of international relations; it is foundational.
The Bot Architecture of Modern Propaganda
A peer-reviewed study by Geissler et al. (2023) analyzed 349,455 pro-Russian messages on Twitter in the weeks following the invasion and found that more than 20 percent were spread by automated bot accounts rather than human users. These bots were not randomly distributed; they were disproportionately active in India, South Africa, Pakistan, and Brazil, countries that subsequently abstained from or voted against United Nations resolutions condemning the invasion. The correlation is not proof of causation, but it is deeply suggestive of a coordinated effort to shape non-Western public opinion at a critical diplomatic juncture. The Atlantic Council’s Digital Forensic Research Lab documented what it described as Russia’s largest known influence operation on TikTok to date, a campaign seeding narratives about Ukrainian political corruption and NATO aggression, specifically designed to erode Western public support for material aid to Kyiv (Atlantic Council, 2024). The campaign exploited TikTok’s recommendation algorithm, which amplifies emotionally charged content regardless of its factual accuracy, to reach audiences who had never actively sought out geopolitical content.
Classical Techniques in a Digital Shell
What is striking about Russia’s information operations is not their novelty but their continuity. The techniques of bandwagon, fear appeal, and card stacking are precisely those catalogued by Shabo (2008) as the enduring toolkit of propaganda. Bandwagon operates through inflated engagement metrics and bot-amplified trending, creating the appearance of overwhelming consensus. Fear appeal is embedded in narratives about nuclear escalation and NATO expansion. Card stacking selectively presents facts about Ukrainian governance failures while systematically omitting context about Russian state violence. What is new is the precision of delivery. Propaganda is most effective when its targets do not recognize it as propaganda. TikTok’s recommendation algorithm, optimized for engagement and blind to truth, creates exactly this condition. Users receive politically charged content not because they sought it out but because an algorithm determined it would keep them scrolling. The propagandist’s dream of invisible influence has been industrialized.
The Agenda-Setting Dimension
McCombs and Shaw’s (1972) agenda-setting theory proposed that mass media does not tell people what to think but what to think about. Russia’s information operations in Ukraine represent a systematic attempt to control agenda-setting at a global scale: to make Ukrainian military setbacks feel more salient than Russian war crimes, to make Western sanctions feel more consequential than Russian territorial aggression, and to make the cost of supporting Ukraine feel higher than the cost of abandoning it. The evidence suggests partial success; despite overwhelming documentation of Russian atrocities, public support for Ukraine aid in several key Western countries showed a measurable decline through 2023 and 2024, a trend that analysts at the European Council on Foreign Relations linked, in part, to sustained information operations targeting European publics. The agenda had been, at least partially, reset.
What International Relations Must Learn
The Russia-Ukraine information war demands a fundamental recalibration of how international relations scholars think about power. Military capability, economic statecraft, and diplomatic influence remain essential, but the capacity to shape how billions of people perceive reality in real time, at scale, through platforms that billions use daily is now an independent variable in the international system. States that ignore this do so at their peril. The response, whether counter-disinformation efforts, media literacy programs, or platform governance reforms, must be conducted with transparency and respect for audience autonomy. The answer to information warfare is not better propaganda. It is better epistemics, and that is, ultimately, a choice about what kind of international order we want to build.
The Russia-Ukraine information war shows that the 21st-century battlefield is no longer just about territory; it is about the capture of global perception. State power can no longer be analyzed through tanks and GDP alone; we must now account for the algorithmic streams that shape what billions believe to be true. The industrialization of “invisible influence” exposes a dangerous vulnerability: if a state can erode international solidarity from a distance through digital “card stacking,” then traditional sovereignty is under threat from within. The choice, ultimately, is stark: build an international order grounded in verifiable reality and platform transparency, or risk surrendering the future to whoever owns the most effective algorithm.

