Pro-Russia social media accounts are amplifying stories about divisive political topics such as immigration and campus protests over the war in Gaza.

Influence operations linked to Russia take aim at a disparate range of targets and subjects around the world. But their hallmarks are consistent: attempting to erode support for Ukraine, discrediting democratic institutions and officials, seizing on existing political divides and harnessing new artificial intelligence tools.

“They’re often producing narratives that feel like they’re throwing spaghetti at a wall,” said Andy Carvin, managing editor at the Atlantic Council’s Digital Forensic Research Lab, which tracks online information operations. “If they can get more people on the internet arguing with each other or trusting each other less, then in some ways their job is done.”

  • AutoTL;DRB

    This is the best summary I could come up with:



    Since the invasion of Ukraine, the European Union has banned Russian media outlets including RT, Sputnik, Voice of Europe and RIA Novosti from publishing or broadcasting within the bloc.

    That hasn’t stopped RT articles from proliferating across hundreds of other websites widely available in Europe, according to a recent report from the German Marshall Fund of the United States, the University of Amsterdam and the Institute for Strategic Dialogue.

    “We discovered RT articles reposted to third-party websites targeting audiences from Iraq to Ethiopia to New Zealand, often without any indication that the content was sourced from a Russian propaganda outlet,” the researchers wrote.

    Covert influence campaigns based in Russia, as well as in China, Iran and Israel, have begun using AI in their attempts to manipulate public opinion and shape politics, according to recent reports from OpenAI, Meta and Microsoft.

    A Russian operation that Microsoft calls Storm-1679 used AI to fake actor Tom Cruise’s voice narrating a phony Netflix documentary disparaging the International Olympic Committee.


    The original article contains 1,190 words, the summary contains 196 words. Saved 84%. I’m a bot and I’m open source!