
U.S. says Russian bot farm used AI to impersonate Americans

Russian state broadcaster RT broadcasts from near Red Square during the 2018 World Cup in Moscow. The Justice Department alleges an RT employee was behind an AI-powered effort to create fake social media profiles of Americans to spread Russian propaganda in the U.S. (Christopher Furlong/Getty Images)

The U.S. Department of Justice said it disrupted a Russian propaganda campaign using fake social media accounts, powered by artificial intelligence, to spread disinformation in the U.S. and other countries.

The bot farm used AI to create profiles impersonating Americans on X, formerly known as Twitter, and to post support for Russia's war in Ukraine and other pro-Kremlin narratives.

It was part of a Kremlin-approved and funded project run by a Russian intelligence officer. The bot farm itself and the AI software behind it were organized by an unnamed editor at RT, the Russian state-owned media outlet, the Justice Department alleged.

Intelligence and security officials have been warning that Russia is ramping up propaganda efforts in a busy global election year, with the goals of undermining international support for Ukraine and discrediting democratic adversaries. The Kremlin has long used fake social media accounts to sow discord and advance its own interests.

Now, advancements in AI technology that allow people to quickly and easily generate realistic text, images, audio and video are raising concerns that the tools can be used to produce propaganda and disinformation at scale. Recently, Facebook owner Meta and OpenAI, the creator of ChatGPT, said they had identified foreign influence campaigns, including some linked to Russia, using AI in their efforts to manipulate the public.

"Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government," FBI Director Christopher Wray said in a statement.

RT has pushed the Russian government's agenda to international audiences, including in the U.S., for years. But the outlet has lost some of its reach since Russia's 2022 invasion of Ukraine, which prompted the European Union to ban Russian state media and led tech companies including TikTok, Facebook, and Google to limit access to its content.

The DOJ said on Tuesday that RT has been looking for alternate distribution channels, and that the bot farm was part of those efforts.

Asked for comment on the allegations, RT's press office replied: "Farming is a beloved pastime for millions of Russians."

The DOJ says nearly a thousand fake profiles on X were part of the Russian campaign. They included a user claiming to live in Minneapolis who posted videos of Russian President Vladimir Putin justifying Russian actions in Ukraine and claiming parts of Ukraine, Poland, and Lithuania were "gifts" from Russian forces after World War II.

X suspended the accounts for terms of service violations, the DOJ said. It's not clear how many people followed the fake accounts or interacted with their posts. X didn't respond to a request for comment.

The DOJ also seized two domain names the bot farm used to create the email accounts behind the fake X accounts.

Copyright 2024 NPR

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.