Policy Roast: Banning the Symptom While Missing the Disease
The EU added nudification tools to the AI Act ban list. Good. Now explain how you'll enforce it when the tools are free, open-source, and run locally.
The European Council just updated the AI Act to explicitly ban nudification tools - AI systems that generate fake nude images of real people without consent. This is unambiguously good policy. Non-consensual intimate images are harassment, full stop. The problem isn't the ban. It's the implicit assumption that banning something makes it go away.
Nudification tools aren't proprietary SaaS products run by traceable companies. They're open-source models you can download from GitHub and run on a gaming laptop. The people creating and distributing these tools aren't incorporated entities with compliance departments. They're pseudonymous developers in jurisdictions that don't care about EU regulations.
So: you've banned the tool. Now what?
The Enforcement Problem
Traditional regulation works when you have identifiable actors operating in the formal economy. You can fine companies, revoke licenses, threaten criminal penalties. This creates compliance pressure because the targets have something to lose.
But AI tools don't work that way. A nudification model is a file - a set of weights and an architecture definition. Once it's released, it's out there forever. You can ban the website hosting it. Another one pops up. You can ban the developer. They release under a new pseudonym. You can ban the specific model. Someone fine-tunes a new version.
The EU's ban on nudification tools is morally correct and practically unenforceable. It's the regulatory equivalent of declaring that gravity is illegal.
What They're Actually Regulating
When you can't regulate the tool itself, you regulate the adjacent infrastructure:
- Cloud providers - Ban nudification tools from AWS, Google Cloud, Azure. This stops commercial-scale services but does nothing about local deployment.
- App stores - Remove apps that facilitate nudification. Good for mobile, irrelevant for desktop users who can install whatever they want.
- Payment processors - Cut off Stripe, PayPal, and credit card networks from nudification services. Effective against monetized services, useless against free tools.
- ISPs and hosting - Require takedowns of sites distributing the tools. This works until the next mirror site launches in a non-EU jurisdiction.
Notice what's missing: any mechanism to stop someone from downloading the model file directly and running it on their own hardware. Because you can't regulate what happens on someone's personal computer. Not effectively. Not at scale.
The Real Policy Failure
The nudification ban addresses the tool instead of the harm. The problem isn't that the technology exists. The problem is what people do with it - creating non-consensual intimate images and distributing them as harassment or revenge.
That's already illegal in most jurisdictions. It's image-based sexual abuse. The EU doesn't need a new AI regulation for this. They need enforcement of existing laws against harassment, defamation, and revenge porn.
Here's what actual enforcement would look like:
- Fast takedown procedures - Mandate that platforms remove non-consensual intimate images within hours, not days. Make the penalty for non-compliance painful enough that platforms invest in detection.
- Criminal penalties for distribution - Prosecute people who create and distribute these images, not just the tools. Use existing cybercrime and harassment laws.
- Victim support infrastructure - Create streamlined processes for victims to get images removed, pursue civil claims, and access legal support without navigating a bureaucratic maze.
- Platform liability - Hold social media and messaging platforms accountable for hosting this content. If they profit from the network effects, they can invest in moderation.
The Missing Piece
Banning nudification tools might make policymakers feel productive, but it won't stop the harm. The tools will remain available. Motivated harassers will keep using them. And victims will still be left trying to get images removed from platforms that move slowly and care more about engagement than safety.
The EU's AI Act is groundbreaking in many ways. But adding nudification tools to the prohibited list without building the enforcement infrastructure to address the actual harm is policy theater. It's a press release pretending to be a solution.
If you want to stop non-consensual intimate images, regulate the distribution and penalize the harm. Banning the tool is the easy part. Enforcement is the work.