Fabricating False Support: Russia's Bot Network Crafts Pro-Occupation Voices in Ukraine

On Russian social networks, some individuals seem like average citizens from occupied Ukrainian cities, lauding Moscow's "reconstruction" initiatives, applauding the courage of Russian soldiers in combat, and criticizing the so-called "neo-Nazi" regime in Kyiv. Occasionally, they even engage in disputes among themselves.

Consider the VKontakte account of Roman Koshelev, who features a black-and-white gym selfie accompanied by the simple slogan “Sport is life.”

This 29-year-old posts daily in support of the Kremlin’s so-called “special military operation” and regularly condemns Ukraine. He follows groups with titles such as “Contract Service in the Russian Defense Ministry” and numerous regional news outlets from across Russia.

However, Roman is not a real person. His profile picture was taken from a sports-nutrition channel on Telegram, and his entire account — similar to thousands of others — is part of a vast network of fake accounts crafted to generate pro-Kremlin sentiment online.

Through the Botnadzor service, which tracks bot activities linked to Russia, The Moscow Times discovered that fraudulent accounts frequently impersonate Ukrainians to promote favorable narratives on VKontakte pages managed by Kremlin-aligned authorities in occupied territories. Their comments, responses, and even discussions create the illusion of public backing for Moscow in these areas.

Online influence operations by Russia existed long before the full-scale invasion of Ukraine. In 2013, Novaya Gazeta uncovered an office in St. Petersburg where paid "trolls" were tasked with posting flattering remarks about Vladimir Putin and Moscow Mayor Sergei Sobyanin, with each worker expected to generate around 100 comments a day.

Just a few years later, the notorious Internet Research Agency, founded by Yevgeny Prigozhin, the late head of the Wagner Group, gained attention for allegedly establishing thousands of social media accounts to meddle in the 2016 U.S. presidential election.

Even after Prigozhin’s death in a plane crash in August 2023, the army of bots continued their activities. For instance, Roman persisted in commenting within Wagner-affiliated groups and on pages managed by Russian-backed officials in occupied eastern Ukraine.

In late October, following widespread power cuts in the Luhansk region, Roman attempted to cast a positive spin on the situation in online discussions, stating, “Walking around the house with candles is even romantic. I anticipate the lights will be restored soon.”

The day prior, he had commended the tree removal in Melitopol, an act likely to be unpopular among locals, but framed by Moscow-aligned authorities as part of an “urbanization” project.

On the VKontakte page of Melitopol’s Moscow-backed administration, numerous bots have left comments masquerading as local inhabitants. Thousands more comments have appeared on various government websites in occupied regions, with some praising the ruling United Russia party and others lauding Russian military efforts as they advance to seize more Ukrainian land.

Human rights organizations argue that the Kremlin has established near-total dominance over the media landscape in occupied Ukraine. Only journalists loyal to Moscow are granted permission to operate there, while others face risks of arrest, torture, or even death, as demonstrated by the case of Ukrainian journalist Viktoriia Roshchyna.

This information void enhances the effectiveness of bot operations, according to Vincent Berthier, head of the technology desk at Reporters Without Borders. He points out that the occupied territories have become "information black holes" where only Kremlin-approved media are permitted.

“Online bots do not create narratives; they leverage existing controversies to gain visibility and legitimacy,” Berthier explained to The Moscow Times in an interview. “Their influence is contingent upon the broader information landscape and the audience’s receptiveness.”

Nevertheless, Berthier noted that assessing the specific contribution of bots to the propaganda machine is challenging. “Disinformation rarely relies on a single tactic. Bots operate within coordinated systems that also encompass human-operated accounts and misleading news websites.”

On September 30 — the anniversary of Russia's declared annexation of four Ukrainian regions — bot activity spiked. Numerous accounts shared congratulatory posts and suggested future territories for Russia to "liberate."

“It would be great to have Kharkiv too, but only if a referendum is conducted. Obviously, no one is talking about coercion,” wrote an account called Vasilisa.

“I’m hopeful for a referendum in Odesa,” added another fake user in reply.

Bots also pick fights with real users who are skeptical of the full-scale invasion.

After Russia fired a Zircon hypersonic missile near NATO borders this year, a fabricated user named Dmitry entered a discussion to accuse a critic of ignoring what he termed Ukrainian assaults on the self-proclaimed People's Republics of Donetsk and Luhansk. The argument echoed Vladimir Putin's justification for the 2022 invasion: that Russia needed to safeguard the Russian-speaking population there from "genocide."

Hours later, another bot joined in. Nadezhda, who portrays herself online as a “housewife and devoted spouse,” insisted that the Zircon launch was “just a military drill, nothing more — not a demonstration of Russia’s power.”

However, the discussion quickly fizzled out. One genuine user in the thread had a browser extension that identifies bot activity on VKontakte, marking accounts as "bot/promoted." Once tagged, the fake accounts fell silent.