ElevenLabs’ AI Voices Allegedly Exploited in Russian Influence Effort

Artificial intelligence (AI) continues to revolutionize countless industries, but it’s also becoming a double-edged sword.

From mimicking human voices to creating persuasive content, tools like ElevenLabs’ voice generation software have made waves for their legitimate applications – and, alarmingly, for misuse.

A recent report highlights the potential involvement of AI-generated voices in Russian influence campaigns designed to undermine European support for Ukraine.

The Campaign: What Is “Operation Undercut”?

Recorded Future, a Massachusetts-based threat intelligence firm, recently uncovered an influence campaign tied to Russia.

Dubbed “Operation Undercut,” the effort primarily targeted European audiences with fake news videos aimed at sowing distrust in Ukrainian leadership and questioning the utility of Western military aid.

For instance, one video claimed, “Even jammers can’t save American Abrams tanks,” implying that sending advanced weaponry to Ukraine is futile. This narrative strategically attempted to discourage European countries from bolstering Ukraine’s defenses.

The Role of AI Voice Technology

What made these misleading videos stand out? According to Recorded Future, AI voice generation tools – likely including ElevenLabs’ software – were used to create professional-quality voiceovers in multiple languages.

By leveraging AI, the campaign aimed to eliminate accents or inconsistencies that might betray the content’s origins. The videos featured voiceovers in languages like English, French, German, and Polish, making them appear more credible and relatable to diverse audiences.

How Researchers Traced the Use of AI

Recorded Future’s investigators used ElevenLabs’ AI Speech Classifier, a tool the company offers to detect whether a given audio clip was generated with its software. The classifier confirmed a match for voiceovers in the campaign’s videos.

While the report acknowledged the use of other AI voice tools, ElevenLabs was the only one specifically named.

Interestingly, the campaign also included videos with human voiceovers. These had detectable Russian accents, inadvertently highlighting the effectiveness of AI-generated voices in masking identities.

We read all the AI news and test the best tools so you don’t have to. Then we send 30,000+ professionals a weekly email showing how to leverage it all to: πŸ“ˆ Increase their income πŸš€ Get more done ⚑ Save time.

Who Is Behind the Campaign?

The report linked this operation to the Social Design Agency, a Russian organization sanctioned by the U.S. government earlier this year. This group reportedly managed over 60 websites that impersonated legitimate European news outlets.

These fake websites amplified the misleading narratives, often shared through bogus social media accounts.

The Broader Impact: Minimal but Concerning

While the campaign’s overall effect on European public opinion was limited, it underscores a growing concern: AI tools can be weaponized to spread misinformation quickly and at scale.

The ability to produce high-quality, multilingual content makes it easier to target diverse populations effectively.

ElevenLabs: Growth Amid Controversy

Founded in 2022, ElevenLabs has quickly become a leading name in AI voice technology. The company’s annual recurring revenue (ARR) skyrocketed from $25 million to $80 million within a year, and it’s now valued at approximately $3 billion.

Backed by notable investors like Andreessen Horowitz and former GitHub CEO Nat Friedman, ElevenLabs has positioned itself as an industry leader.

Safety Measures and Challenges

In response to past controversies – including a January 2024 incident where their technology was used in a robocall impersonating President Joe Biden – ElevenLabs has introduced new safety features.

These include automated tools to block unauthorized impersonations and human moderation teams to ensure compliance. The company’s policies explicitly prohibit “unauthorized, harmful, or deceptive impersonation.”

Why AI Voice Tech Needs Accountability

The misuse of AI voice technology in campaigns like “Operation Undercut” raises critical questions about accountability.

While companies like ElevenLabs have taken steps to mitigate misuse, the rapid evolution of AI means constant vigilance is required. Policymakers, tech companies, and users must collaborate to ensure these tools are used responsibly.

What’s Next for AI Voice Tech?

As AI voice generation tools become more sophisticated, their potential applications – both good and bad – will only expand. From enhancing accessibility to creating lifelike virtual assistants, the possibilities are endless.

However, incidents like this remind us that innovation must be paired with robust safeguards.

Key Takeaways

  • AI’s Double-Edged Sword: While tools like ElevenLabs’ voice generation software offer groundbreaking capabilities, they also present significant risks if misused.
  • Influence Campaigns: Operation Undercut highlights how AI can amplify misinformation in ways that are difficult to trace.
  • Accountability is Crucial: As the technology evolves, companies must prioritize safety measures to prevent misuse.
  • The Role of Collaboration: Governments, tech companies, and civil society need to work together to regulate and monitor AI advancements.
| Aspect | Details |
| --- | --- |
| Campaign Name | Operation Undercut |
| AI Tool Involved | Likely ElevenLabs and other commercial AI voice generators |
| Languages Used | English, French, German, Polish, Turkish |
| Key Actors | Social Design Agency (Russia-based, U.S.-sanctioned) |
| Target Audience | European countries |
| Impact on Public Opinion | Minimal, but highlights risks of AI misuse |
