The Clone Wars: A New Congress Reconsiders the NO FAKES Act to Combat Digital Deepfakes

April 29, 2025

In April 2023, vocal deepfakes went viral as the song “Heart on My Sleeve” garnered millions of hits across popular music platforms. Unbeknownst to listeners, however, the voices of Drake and The Weeknd were entirely AI-generated, complete with an unauthorized producer tag from Metro Boomin. The song was live for two weeks before Universal Music Group (“UMG”) had it removed through a Digital Millennium Copyright Act (“DMCA”) takedown notice.

While UMG may have won the battle, the war over artificial intelligence (“AI”) and vocal cloning continues. Toward the end of 2024, numerous public and private stakeholders weighed in on the issue, debating the pros and cons of national protections against the misuse of AI. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act (the “NO FAKES Act”)—intended to protect “rights in the voice and visual likeness of individuals”—garnered considerable bipartisan support. Had it passed, the law would have provided a national framework to protect individuals and supply tools to fight back effectively against digital deepfakes. But the proposed bill failed to receive final legislative approval before the session ended.

Without a national standard of protection, vocal deepfakes have continued to run rampant across the Internet and social media. Anyone is susceptible to AI vocal cloning, including politicians, voice actors, music artists, and even everyday citizens. With even a small sample of a person’s voice, AI tools can convincingly imitate its sound and tone to recreate (or even generate) content that sounds like the original speaker. This can have a devastating impact when a vocal deepfake depicts a person saying scandalous, inflammatory, or problematic things—which the actual person never said. As just one example, the ElevenLabs tool was likely used to create a vocal deepfake of then-presidential candidate Joe Biden discouraging voters from participating in the New Hampshire primary elections.

Unfortunately, the legal protections across the various states are a patchwork at best. In most states, current laws protecting name, image, and likeness (“NIL”), and a person’s general right of publicity, are insufficient to address this entirely new threat. And such abuses have proven difficult (if not impossible) to track, let alone to challenge in court. Only a handful of states (like Tennessee, which passed the ELVIS Act to protect musicians) have started to address the threat of AI deepfakes.

But there is good news too. In April 2025, a new Congress reintroduced the NO FAKES Act, which would provide uniform, nationally recognized protections against vocal cloning and digital deepfakes. Among its various provisions, the proposed law would give individuals the power to subpoena online services for information or data related to the deepfakes. So too would the bill allow victims to initiate notice-and-takedown proceedings similar to those under the DMCA.

As the vocal Clone Wars continue, support appears to be growing for Congress to pass the NO FAKES Act and provide individuals (including voice actors and music artists) with tools to fight back effectively against AI-generated deepfakes. Both private entities and public stakeholders continue to wrestle with the best way to regulate the use of AI, and state and federal legislatures continue to assess various bills. Anyone facing issues with vocal cloning or digital deepfakes is encouraged to consult with experienced counsel as the legal landscape continues to evolve.
Timothy J. Miller (LAW ’23) is an attorney in the Philadelphia office of Blank Rome LLP and is a member of the firm’s General Litigation Practice and Trade Secrets & Competitive Hiring groups.

Jeffrey N. Rosenthal is a Partner in the Philadelphia office of Blank Rome LLP where he co-leads the firm’s Biometric Privacy Team and is a member of its Privacy Class Action Defense and Cybersecurity & Data Privacy groups.