Adult Creators Face 48-Hour Takedown Law as AI Exploits 10,000 Terabytes of Porn

5 articles · Updated · MIT Technology Review · May 14
  • Adult performers say deepfakes and generative AI are exploiting their bodies and likenesses without consent, inflicting psychological harm and threatening income as fake nudes, cloned personas and scam accounts spread online.
  • More than 10,000 terabytes of online porn likely feed AI systems and nudify apps, experts say, yet creators usually cannot prove their work was used because training data remains a black box.
  • Copyright remains the main U.S. tool for removals, but it often fails when bodies are altered or anonymized; anti-piracy firms say they have fingerprinted over half a billion videos and removed 130 million from Google.
  • The Take It Down Act requires websites to remove nonconsensual intimate imagery within 48 hours, but lawyers warn bad actors could use it to falsely target legal porn and erase performers' work.
  • That risk adds to a broader legal gap: adult creators are often treated as the forgotten victims of deepfakes, with few clear rights over AI-generated copies trained on or imitating their images.

The TAKE IT DOWN Act and the Deepfake Crisis: How New Federal Law Targets Nonconsensual Intimate Imagery and AI Abuse in 2025

Overview

The TAKE IT DOWN Act is a major new federal law aimed at stopping the spread of nonconsensual intimate imagery, especially deepfakes. Spearheaded by Senators Ted Cruz and Amy Klobuchar, it drew strong bipartisan support and passed quickly through Congress. Its core provision targets digital forgeries, making it illegal to knowingly publish deepfake intimate images of adults without their consent. That broad consensus reflects a unified federal effort to curb maliciously created and shared intimate content, setting a nationwide standard to protect individuals from the harms of digital manipulation.

...