Updated · SCOTUSblog · May 14

Supreme Court Weighs Section 230 in X Child Pornography Case After 9-Day Takedown Delay
  • Thursday’s private conference will determine whether the justices take up Doe v. X Corp., a case asking them to revisit Section 230 immunity for platforms that leave child sexual-abuse material online after notice.
  • The petition centers on claims that Twitter, now X, refused to remove videos of two underage boys in 2020, telling one victim it found no policy violation and deleting the posts only 9 days later after Homeland Security was contacted.
  • Lower courts largely dismissed the suit under Section 230, though the 9th Circuit let some claims proceed over Twitter’s reporting duties and complaint-handling design because those did not stem from its role as a publisher.
  • The Does say immunity should not cover a platform that knowingly leaves criminal content up after notification, while X warns changing that rule would upend a Section 230 framework the digital economy has relied on for nearly 30 years.
  • The case gives the court another chance to narrow Section 230 after it declined to do so in its 2022-23 terrorism cases and later passed on a Snapchat design-liability dispute.

Supreme Court Showdown: Section 230, X (Twitter), and the 1.5 Million AI-Linked CSAM Reports Reshaping Tech Accountability

Overview

A major Supreme Court petition, John Doe #1 and John Doe #2 v. X Corp. (formerly Twitter), challenges the broad legal immunity that social media platforms enjoy under Section 230, particularly where Child Sexual Abuse Material (CSAM) is shared online. The plaintiffs seek to hold platforms like X accountable, arguing that companies routinely escape responsibility when CSAM appears on their sites. Public figures such as Tim Tebow are urging the Court to prioritize child safety over profit. The case could redefine Section 230's boundaries and push platforms to act more aggressively against harmful content.

...