
OpenAI uses its tools to shut down influence networks in Russia and China


OpenAI, a leading artificial intelligence company, has revealed that it shut down five covert influence operations in the past three months. These networks, based in Russia, China, Iran, and Israel, were using OpenAI's products to manipulate public opinion and shape political outcomes while concealing their true identities.

In a recent report, OpenAI highlighted how these influence networks utilized AI tools to generate text and images at a larger scale and with fewer errors than human-generated content. Despite their efforts, the company stated that these campaigns ultimately failed to significantly increase their reach.

Ben Nimmo, principal investigator on OpenAI’s Intelligence and Investigations team, emphasized the importance of addressing the potential risks associated with AI-powered influence operations. He stated, “With this report, we really want to start filling in some of the blanks.”

The identified covert operations, including groups known as "Doppelganger" and "Spamouflage," used OpenAI's AI models to automate posting on platforms such as Telegram and to generate comments intended to sway public opinion. OpenAI also noted that some of the AI-generated content contained common model error messages, which helped the company identify and defend against such operations.

While these influence networks may not have gained widespread traction, OpenAI stressed the need for continued vigilance. Nimmo warned, “History shows that influence operations which spent years failing to get anywhere can suddenly break out if nobody’s looking for them.”

OpenAI’s efforts to combat covert influence operations align with similar disclosures made by companies like Meta Platforms Inc. The company plans to share more reports in the future and collaborate with industry peers to address the growing threat of AI-powered manipulation.
