Sweeping new legislation forces tech platforms to swiftly remove nonconsensual deepfake images, signaling a new era of accountability and victim protection in the age of AI-powered threats.
The Take It Down Act, signed into law this week by President Trump, marks a significant milestone in addressing the growing problem of nonconsensual deepfake pornography, one amplified by advances in AI technology, according to Yu Chen, a professor at Binghamton University, State University of New York.
Chen, whose research focuses on authenticating AI-generated content and detecting deepfake attacks, sees several key points worth considering:
1. The law targets the nonconsensual publication of sexually explicit images, including those generated by AI, which is critical given the increasing accessibility of AI tools that create realistic deepfakes. By criminalizing the publication or threat of publication of such content and mandating platforms to remove it within 48 hours, the legislation provides a legal framework to protect victims and hold perpetrators accountable.
2. The overwhelming bipartisan support (409-2 in the House, unanimous in the Senate) reflects a rare consensus on the urgency of this issue. This broad coalition, including tech giants like Meta and Google, underscores the societal recognition of deepfakes as a serious threat.
3. As one of the first major U.S. laws directly tackling AI-generated content, the Take It Down Act sets a precedent for regulating AI's societal impacts. However, it also highlights the difficulty of balancing victim protection with privacy, free expression, and technological innovation. For researchers like me, it will be valuable to study how the law is implemented, particularly how the FTC navigates enforcement and whether the law withstands anticipated legal challenges on First Amendment grounds.
4. Questions remain about how platforms will verify nonconsensual content, how victims will navigate the takedown process, and whether the law will deter the creation of deepfake tools.