The Russia-linked threat actor CopyCop is trying to influence the upcoming U.S. presidential election using fake news websites and generative artificial intelligence, researchers have found. CopyCop is likely aligned with the Russian government and was created to spread manipulated political content at scale. Researchers first reported on this network in May, when it mostly targeted political leaders in France, Ukraine, and the European Union. In a recent campaign, the threat actor narrowed its focus to the U.S. and the presidential election this November, according to a new report by researchers at Recorded Future’s Insikt Group. The Record is an editorially independent unit within Recorded Future.

CopyCop has shifted to U.S.-based hosts for registering new websites, likely trying to minimize its connections to Russian infrastructure, researchers said. As of late May, former U.S. President Donald Trump and President Joe Biden are the most frequently mentioned people in CopyCop posts. The group highlights mistakes Biden has made during speeches and criticizes his administration’s failure to curb inflation. It is less critical of Trump, downplaying his conviction in a hush-money trial and saying that it will have no impact on the elections.

To create manipulated politically themed content quickly, CopyCop likely scrapes articles from other outlets, mostly U.S. conservative-leaning news organizations or Russian state-affiliated media, and plagiarizes them using generative AI. “Such AI-generated content is unlikely to have as much impact on the elections as targeted content,” the researchers said.

CopyCop has also expanded its use of AI to generate inauthentic journalist personas for its articles’ author profiles. Researchers found over 1,000 distinct author profiles and descriptions across the 120 new websites created by CopyCop in May. “AI-generated influence content allows influence actors like CopyCop to rapidly launder emerging narratives targeting the 2024 U.S. elections and obscure their origin, making it harder to attribute influence operations to foreign adversaries,” Recorded Future said.

Given the threat from malign state actors using AI to undermine global elections this year, U.S. Deputy Attorney General Lisa Monaco said recently that the Justice Department “will remain vigilant to foreign adversaries abusing AI to accelerate online hate and disinformation, imitate trusted sources of information, and proliferate deepfakes.”

CopyCop is just one of many Russia-linked influence networks trying to spread misleading narratives about hot political topics such as the election in the U.S., the war in Ukraine, or protests in Georgia. Earlier this month, researchers discovered a campaign by the Russia-linked malign network Doppelgänger targeting American users on the social media platform X to discredit protests in Georgia sparked by an unpopular law that threatens the independence of local media.

Some Russia-affiliated influence groups work together to promote each other’s content on social media. CopyCop’s articles, for example, were amplified by an inauthentic media outlet with ties to Doppelgänger, according to Recorded Future. Doppelgänger recently ran a social media influence campaign in which it spread thousands of images with fake anti-Ukraine quotes falsely credited to celebrities, including Jennifer Aniston, Scarlett Johansson and Elton John.
Daryna Antoniuk is a reporter for Recorded Future News based in Ukraine. She writes about cybersecurity startups, cyberattacks in Eastern Europe and the state of the cyberwar between Ukraine and Russia. She previously was a tech reporter for Forbes Ukraine. Her work has also been published at Sifted, The Kyiv Independent and The Kyiv Post.