Runner gets incorrect marathon directions from ChatGPT

Updated · Financial Times · May 6 · 1 article
  • On London Marathon day, the chatbot wrongly suggested routes from Paddington via Liverpool Street, and later a non-existent Elizabeth line service to London Bridge, to reach the start at Blackheath.
  • The advice conflicted with Google Maps and Citymapper, which pointed to Charing Cross or Waterloo, and could have left the runner stranded amid race-related transport disruption.
  • The report argues that large language models can sound authoritative while being wrong, leading users to trust a persuasive explanation over more reliable route-planning tools.
If a friendly chatbot can get you lost, why is it being integrated into our health, finances, and travel?
To make AI safer, should we design it to be less friendly and more like an obviously fallible tool?