In a first, OpenAI removes influence operations tied to Russia, China and Israel
From NPR: 2024-05-31 06:57:48
OpenAI recently revealed that it had taken down influence operations originating in Russia, China, Iran, and Israel. The actors behind them had been using AI tools like ChatGPT to generate and spread deceptive content on social media platforms. While these operations didn’t gain significant traction, the potential for AI-generated content to manipulate elections remains a concern globally.
The boom in generative artificial intelligence poses new threats to election integrity, as fraudsters use AI to create convincing fake audio, video, images, and text. OpenAI has already banned accounts associated with five covert influence operations, including Russia’s Doppelganger and China’s Spamouflage, both known for spreading propaganda online. These operations used AI to generate multilingual comments and manipulate content across platforms.
Meta and OpenAI have both moved to disrupt covert influence operations, including one traced back to a political marketing firm in Tel Aviv called Stoic. Fake accounts associated with Stoic posed under a range of fabricated identities and posted content related to the war in Gaza and other political issues. These operations highlight the need for continued vigilance as the technology for manipulating public opinion advances rapidly.
Read more at NPR: In a first, OpenAI removes influence operations tied to Russia, China and Israel