After Russia invaded Ukraine in February 2022, tech platforms enforced bans against Russian state media. But actors found ways to circumvent these bans and continue circulating Russian propaganda — often through websites posing as local news.
As part of PolitiFact’s United Facts of America, Peter Benzoni, investigative data and research analyst on the Alliance for Securing Democracy’s information manipulation team, spoke to PolitiFact Executive Director Aaron Sharockman about the ways that Russia has influenced the U.S. media landscape.
Through a process Benzoni described as "information laundering," Russian-aligned forces move false or misleading information from less credible sources to more credible ones, such as mainstream media outlets.
"Information laundering" isn’t new, Benzoni said. In 1983, the KGB, the Soviet Union’s security and intelligence agency, planted a story in a Soviet-funded New Delhi newspaper. It was a conspiracy theory that the U.S. manufactured HIV at a biological research center in Fort Detrick, Maryland. The allegations gained international traction, landing on the "CBS Evening News," where anchor Dan Rather cited the claim and noted that the source offered "no hard evidence."
"You have this integration into the information ecosystem," Benzoni said, "wherein enough traction is created and has gone through enough laundering that the information is successfully given legitimacy and treated as though it came from a legitimate source."
In the digital era, Benzoni said information moves faster and reaches a wider audience than ever and often escapes scrutiny. That’s partly because it’s easy to create a website that looks legitimate — publishing one takes as little as "$20 and a WordPress subscription," Benzoni said.
In today’s digital ecosystem, he said, websites create chains of citations, attributing the same information to different sources. That obscures where the information originated and makes its authenticity hard to assess.
"In some cases, this confusion is able to be exploited and result in the intentional obfuscation of the true starting point of information," he said. "We have to be very cognizant of the tactics used to hide this information and realize that it's used to make this disinformation appear more credible, or to avoid accountability."
Benzoni outlined some of the ways that content from state media circumvents bans, such as the European Union’s suspension of access to Russia Today, or RT:
Mirror and alternative sites that simply replicate the banned content. These are generally controlled by the original site.
Aggregators, sites that compile news from different sources and tend to link back to the original source.
Copy-and-paste sites that lift and republish content without verification or attribution. State media content republished this way on third-party websites — including the U.S. right-wing conspiracy-oriented website InfoWars — can reach a global audience. "So they’re able to spread misinformation as though it was news for you, produced by them," he said.
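Copy-and-paste republishing can often be spotted automatically because the lifted text overlaps heavily with the original. As a minimal sketch — not the method of any specific tool, with illustrative sample texts and an arbitrary threshold — overlap can be measured by comparing word-level n-grams ("shingles") between two articles:

```python
# Minimal sketch: detect likely copy-and-paste republishing by measuring
# word n-gram ("shingle") overlap between two texts. Sample texts and the
# shingle size are illustrative assumptions, not drawn from any real tool.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles in a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A & B| / |A | B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "Officials said the new policy will take effect next month across the region."
copied = "Officials said the new policy will take effect next month across the region, reports say."
unrelated = "The city council voted on a separate budget measure on Tuesday evening."

sim_copy = jaccard(shingles(original), shingles(copied))
sim_other = jaccard(shingles(original), shingles(unrelated))

print(f"copy similarity: {sim_copy:.2f}")       # high overlap
print(f"unrelated similarity: {sim_other:.2f}")  # no shared shingles
```

A lifted article scores far above an unrelated one, even when the copy adds or trims a few words; real systems layer normalization and fuzzier matching on top of this idea.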
He said some sites will try to evade bans by frequently rotating or switching out domains. He described it as "digital Whac-a-Mole." For example, analysts found 12 domains identical to RT, including some in different languages. Switching out domains allows actors to bypass digital barriers, such as being blacklisted or ranked lower in Google Search results.
Benzoni gave an example of an RT story that had its content almost completely lifted and republished by a site named "Little Rock AR News." An identical website, "Albuquerque Breaking News," also republished the RT story. He said a network of regional sites, national sites and faux local news sites targeting specific U.S. cities was eventually uncovered because the websites shared the same registration information.
How can users detect these dubious sites? Benzoni shared a tool called The Disinformation Laundromat, a project by the Alliance for Securing Democracy, which people can use to discover linkages between websites through text matching and web forensics.
Here’s how it works: Users can enter a piece of content to identify other websites that posted copies of it. The tool can also show whether a website is likely linked to others by analyzing identifiers such as its domain name and the unique ID tied to its Google AdSense or publisher account.
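The publisher-account signal works because AdSense embeds the same "ca-pub-…" identifier in the HTML of every site monetized under one account. As a minimal sketch — the domain names, HTML snippets and IDs below are invented examples, and this is not the Laundromat’s actual code — shared IDs can be surfaced with a regular expression and a grouping pass:

```python
# Minimal sketch: group websites by a shared Google AdSense publisher ID,
# one of the web-forensics signals described in the article. The "ca-pub-"
# pattern is real AdSense markup; all sites and IDs here are invented.
import re
from collections import defaultdict

ADSENSE_ID = re.compile(r"ca-pub-\d{10,20}")

# Invented page sources standing in for fetched HTML.
pages = {
    "littlerock-example.com": '<script data-ad-client="ca-pub-1234567890123456"></script>',
    "albuquerque-example.com": '<ins class="adsbygoogle" data-ad-client="ca-pub-1234567890123456"></ins>',
    "unrelated-example.com": '<script data-ad-client="ca-pub-9999999999999999"></script>',
}

sites_by_id = defaultdict(list)
for site, html in pages.items():
    for pub_id in set(ADSENSE_ID.findall(html)):
        sites_by_id[pub_id].append(site)

# A publisher ID appearing on multiple domains suggests common control.
linked = {pid: sorted(sites) for pid, sites in sites_by_id.items() if len(sites) > 1}
print(linked)
```

In this toy run the two faux local news sites surface as linked through one publisher ID, while the unrelated site does not; investigators combine this with other identifiers, such as registration records, before drawing conclusions.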
The growth of artificial intelligence also poses challenges when it comes to identifying false websites, Benzoni said.
AI can be used to "clog up" detection systems and to generate fake but realistic content, such as text produced by ChatGPT.
AI models can generate content that modifies a single source by rephrasing text or altering images, making similarity detection across sites harder. AI can also help create dynamic websites that frequently and automatically refresh elements such as content and structure.
Benzoni said that in the lead-up to the 2024 U.S. election, he expects forces that want to build "undetectable news networks" for disseminating political messages will find it easier to do so, given AI’s availability.
"So, you're able to borrow the legitimacy of local news sites to make a point that you could not otherwise make without being called out for it," he said. "I think you're gonna see a huge increase in volume and I think you're gonna see a huge increase in the types of people using these dissemination networks."
PolitiFact YouTube video, How Russia’s disinformation campaigns can target your town, Nov. 6, 2023
The MIT Press Reader, Lessons From Operation "Denver," the KGB’s Massive AIDS Disinformation Campaign, accessed Nov. 7, 2023
Los Angeles Times, AIDS: A GLOBAL ASSESSMENT : Soviets Suggest Experiment Leaks in U.S. Created the AIDS Epidemic, Aug. 9, 1987
Institute for Strategic Dialogue, RT Articles are Finding their Way to European Audiences – but how?, July 20, 2022
The Disinformation Laundromat, accessed Nov. 6, 2023