How the Take It Down Act tackles nonconsensual deepfake porn − and how it falls short
In a rare bipartisan move, the U.S. House of Representatives passed the Take It Down Act by a vote of 409-2 on April 28, 2025. The bill is an effort to confront one of the internet’s most appalling abuses: the viral spread of nonconsensual sexual imagery, including AI-generated deepfake pornography and real photos shared as revenge porn.

Now awaiting President Trump’s expected signature, the bill offers victims a mechanism to force platforms to remove intimate content shared without their permission – and to hold those responsible for distributing it to account.

As a scholar focused on AI and digital harms, I see this bill as a critical milestone. Yet it leaves troubling gaps. Without stronger protections and a more robust legal framework, the law may end up offering a promise it cannot keep. Enforcement issues and privacy blind spots could leave victims just as vulnerable.

The Take It Down Act targets “non-consensual intimate visual depictions” – a legal term that encompasses what most people call revenge porn and deepfake porn. These are sexual images or videos, often digitally manipulated or entirely fabricated, circulated online without the depicted person’s consent.

The bill compels online platforms to build a user-friendly takedown process. When a victim submits a valid request, the platform must act within 48 hours. Failure to do so may trigger enforcement by the Federal Trade Commission, which can treat the violation as an unfair or deceptive act or practice. Criminal penalties also apply to those who publish the images: Offenders may be fined and face up to three years in prison if anyone under 18 is involved, and up to two years if the subject is an adult.

A growing problem

Deepfake porn is not just a niche problem. It is a metastasizing crisis. With increasingly powerful and accessible AI tools, anyone can fabricate a hyper-realistic sexual image in minutes. Public figures, ex-partners and especially minors have become regular targets. Women, disproportionately, are the ones harmed.

These attacks dismantle lives. Victims of nonconsensual intimate image abuse suffer harassment, online stalking, ruined job prospects, public shaming and emotional trauma. Some are driven off the internet. Others are haunted repeatedly by resurfacing content. Once online, these images replicate uncontrollably – they don’t simply disappear.

In that context, a swift and standardized takedown process can offer critical relief. The bill's 48-hour response window has the potential to restore a measure of control to those whose dignity and privacy were invaded with a click. But despite its promise, unresolved legal and procedural gaps could hinder its effectiveness.

NBC News gives an overview of the Take It Down Act.

Blind spots and shortfalls

The bill targets only public-facing interactive platforms that primarily host user-generated content such as…
