How AI Influencers and Russian Disinformation Propel the Far-Right in Germany’s Election Landscape
Voters in Germany are increasingly confronted with an influx of far-right narratives disseminated online, stemming from AI-generated content and Russian disinformation initiatives. Experts monitoring social media have identified Russian-based groups, including “Doppelganger” and “Storm-1516,” which were previously found to be active in American elections. As Germany approaches its election for a new Bundestag this Sunday, these campaigns are leveraging artificial intelligence to amplify their messages. Notably, the far-right Alternative for Germany (AfD) party has outperformed other parties in social media engagement throughout the campaign and currently ranks second in opinion polls.
Techniques employed include the fabrication of fake TV news reports and deepfake videos featuring fictitious “witnesses” or “whistleblowers” who concoct stories about well-known politicians. In November 2024, for instance, ahead of the snap election, a video circulated claiming that a prominent pro-Ukraine member of parliament was actually a Russian spy. Dr. Marcus Faber, chairman of parliament’s defense committee and a member of the Free Democratic Party, was specifically targeted in a video that used AI to suggest a former adviser was making the allegation. Approached for comment, Dr. Faber was unavailable to respond.
In another instance, an AI-generated video showed an 18-year-old woman falsely accusing a German minister of child abuse. A recent investigation by the Center for Monitoring, Analysis, and Strategy (CeMAS)—a non-profit think tank that focuses on analyzing disinformation and right-wing extremism—along with Alliance 4 Europe, which combats digital disinformation, has linked these stories to the Russian disinformation campaign Storm-1516. The researchers have also monitored the Doppelganger campaign, which is operated by the Russian PR agency Social Design Agency, noted for its connections to the Kremlin. They discovered that the group’s primary strategy involves producing fake news articles that mimic reputable publications, which are then disseminated through a network of social media accounts. Often, these posts appear to originate from concerned citizens, with messages like: “I am concerned that aid to Ukraine will impact our ability to invest in our own infrastructure and social security systems.”
One such post directed readers to a fake article that criticized Germany’s financial support for the war in Ukraine, appearing on a phony website that mimicked the German newspaper Der Spiegel. Julia Smirnova, a senior researcher for CeMAS, remarked, “Different Russian campaigns aim both to undermine established parties and to bolster the far-right AfD.” She emphasizes, “This isn’t just about one fake video or article; it represents a systematic effort to continuously inundate the public with a stream of false narratives and propaganda.”
Between mid-December 2024 and mid-January 2025, CeMAS identified 630 German-language posts exhibiting typical Doppelganger characteristics on the social media platform X alone. Cybersecurity policy adviser Ferdinand Gehringer noted that Russian interference in online discourse is to be expected. “Russia has clear objectives to manipulate our public opinion and further its own agenda,” he stated. He pointed to the AfD’s proposals to halt arms shipments to Ukraine and increase imports of Russian gas as indicative of Russia’s interest in fostering cooperation.
Additionally, CeMAS discovered a case where false information from a Russian disinformation campaign was propagated by an AfD politician. Member of Parliament Stephan Protschka shared on social media that the Green Party collaborated with Ukraine to recruit individuals for criminal activities to discredit the AfD—an assertion linked to Russian propaganda. Sky News attempted to reach Mr. Protschka for commentary but received no response. Efforts to contact the Social Design Agency regarding the Doppelganger group’s allegations were also unsuccessful, and attempts to engage members of the Storm-1516 campaign went unanswered.
Beyond the chaotic sphere of Russian-driven disinformation, far-right factions in Germany are intensifying their online engagement. A notable example is Larissa Wagner, an AI-created social media influencer. In a video posted to her X account on September 22, 2024, coinciding with the Brandenburg state election, she stated, “Hey guys, I’m just on my way to the polling station. I’m daring this time. I’m voting for AfD.” Her accounts on X and Instagram, launched in the past year, feature regular videos promoting far-right messages, including telling Syrian immigrants to “pack your bags and go back home.” She claimed to have interned with the right-wing magazine Compact, which was banned by the German government last year.
It remains unclear who is behind Larissa’s creation. When approached by Sky News on Instagram, she responded, “I think it’s completely irrelevant who controls me. Influencers like me are the future… Like anyone else, I want to share my perspective on things. Every influencer does that. But because I’m young, attractive, and right-wing, it’s framed as ‘influencing the political discourse.'” According to Ferdinand Gehringer, the radicalism of her posts has escalated over time. “The potential for influence is substantial—especially since the presence of a young, attractive woman enhances audience engagement,” he noted.
The far-right’s adaptation of generative AI on social media extends beyond figures like Larissa. A recent report from the Institute for Strategic Dialogue revealed that 883 posts since April 2023 incorporated images, memes, and music videos produced with generative AI, stemming from both AfD supporters and the party itself.
In October alone, party accounts published over 50 posts featuring generative AI content, with the AfD reportedly using the technology more extensively than any other political party. Pablo Maristany de las Casas, an analyst at the Institute for Strategic Dialogue and co-author of the report, states, “They are clearly the one actor that is exploiting this technology the most.” The far-right narratives analyzed fall into two primary categories: those portraying migrants as violent criminals through AI-generated imagery, and those celebrating traditional German values. When these themes intersect, “the far-right community feels more unified in the so-called cultural fight against these targeted groups,” explains Maristany de las Casas.
A notable example is the “Remigration Song,” a project that features a music video created by the now-defunct youth wing of the AfD, promoting the mass deportation of immigrants, termed remigration. This type of home-grown content is raising concerns among experts about its potential to influence public perception. A recent survey by the Bertelsmann Foundation, a think tank advocating for social reform, revealed that 80% of Germans view disinformation on the internet as a significant societal issue. Furthermore, 88% of respondents believe that disinformation is disseminated with the intent to sway political opinions. Senior researcher Cathleen Berger notes, “Just the foreign information itself is probably not going to shift attitudes. I think the impact only comes when it is being picked up by domestic actors.”