Ashley MacIsaac sues Google over false AI sex offender claims
Updated · The Guardian · May 5
The Canadian musician filed the C$1.5m case in Ontario superior court, seeking general, aggravated and punitive damages after Google’s AI Overview allegedly invented multiple criminal convictions.
The lawsuit says the false claims led the Sipekne’katik First Nation to cancel a 19 December concert, for which it later apologised, and left MacIsaac fearing for his safety and reputation.
MacIsaac alleges Google never contacted him or apologised; Google previously said AI Overviews are regularly improved when errors arise, and the feature now notes his legal action.
The Ashley MacIsaac Case: AI Hallucination, Defamation, and the Fight for Accountability in Tech
Overview
In late 2025, Google's AI system mistakenly identified Canadian musician Ashley MacIsaac as a convicted sex offender, confusing him with another individual who shares his surname. The false information spread quickly, leading the Sipekne’katik First Nation to cancel his concert. After discovering the AI error, MacIsaac faced serious personal safety concerns, reputational damage, and financial loss. Google issued no direct apology, prompting MacIsaac to file a C$1.5 million defamation lawsuit in 2026 that tests the limits of AI liability law. The case drew global attention, spurring governments to advance AI regulation and pushing the tech industry to strengthen safeguards, while fuelling debate over trust and accountability in AI-generated content.