Eurasia Review Interviews: Disinformation, Platforms, And Cognitive Security In The Algorithmic Age
An interview with Yuliia Dukach, PhD, Head of Disinformation Investigations at OpenMinds
Disinformation is no longer best understood as a sequence of viral falsehoods or episodic campaigns. It now operates as a persistent feature of the digital information environment, shaped as much by platform architectures, data access regimes, and automated systems as by state intent.
In this conversation, Yuliia Dukach reflects on how contemporary influence operations function in practice: how enduring narratives attach themselves to events, how social platforms condition behavioural impact, and how generative AI is transforming the scale and economics of information manipulation without altering its strategic objectives.
Drawing on extensive investigative work in Ukraine, one of the most heavily targeted information environments globally, Dukach examines the limits of platform moderation, the structural constraints researchers face due to restricted data access, and the growing challenge of distinguishing authentic public discourse from synthetic participation.
She also considers which elements of Ukraine’s experience are transferable to other regions, including the Global South, and where local political and social conditions impose hard limits.
Rather than focusing on individual incidents, this interview examines influence operations as systems, embedded in digital infrastructures, amplified by algorithms, and increasingly directed at the information ecosystem itself.
Q. Disinformation is often discussed as a series of isolated incidents or campaigns. From your investigative perspective, how should we instead understand contemporary influence operations as long-term cognitive and societal challenges?
A: I would challenge the premise that disinformation is primarily discussed as a series of isolated incidents or campaigns. That said, I do believe that disinformation can often be studied most effectively at the level of discrete information operations.
In Ukraine, which has been resisting targeted and destructive Russian information operations since at least 2014 — following the annexation of Crimea and the start of the war, including the occupation of parts of Donetsk and Luhansk regions — there is already a fairly clear understanding of the core narratives and strategic objectives behind these efforts. What tends to change is not the underlying logic, but rather the specific news hook, the content production script, and, gradually, the technologies and channels of dissemination.
The better we understand these persistent narratives, the easier it becomes to anticipate which events they are likely to be attached to, where to look for emerging information operations, and how to design effective counter-narratives or prebunking strategies. For example, we know that Russia has a strategic interest in reducing international support for Ukraine, lowering the number of people willing to serve in the Ukrainian military, and undermining trust in Ukrainian state institutions.
In some respects, Ukrainians may even find this environment slightly more legible. Living through an active war — under occupation, daily drone and missile attacks, or often without electricity, water, or heating during the winter — leaves little ambiguity about who the adversary is and what they are trying to achieve.
The situation is more complex in democratic systems elsewhere. In countries that Russia openly frames as part of its “sphere of influence,” every election becomes a stress test, while narratives that frame counter-disinformation efforts as censorship (or vice versa) tend to amplify, rather than constrain, the impact of influence operations.
Q. Russia conceptualises its influence activities as “information war”, implying a state-to-state form of confrontation rather than the language of strategic communications or hybrid warfare. How useful is this framing for understanding contemporary influence operations?
A: Despite claims that information operations have become fully “marketised” and decentralised, in Russia they are still often conducted by military units or militarised structures. In Ukraine, for example, during the 2022 full-scale invasion, these actors operated in parallel with kinetic occupation: alongside physical control of territory, they built and managed networks of local Telegram channels designed to legitimise that occupation. From this perspective, “information war” is not a metaphor but a component of military conflict.
This logic increasingly extends beyond Ukraine. Across Europe, information and cyber operations are accompanied by subversive activities carried out by individuals recruited by Russia via social media. In parts of Africa and Latin America, journalists have documented covert schemes recruiting people into the Russian military or defence-related industries.
At the…
