Author Urges Externalized AI Context Files After 15 Million-Token Session Hit Limits

3 articles · Updated · O'Reilly Media · May 14
  • A 15 million-token Copilot session that ran for hours led the author to argue developers should externalize AI context to files instead of relying on a single chat’s memory.
  • Context windows are finite, and once compaction or a fresh session wipes prior state, AI tools can keep producing plausible code while silently losing requirements, design decisions and bug history.
  • The proposed fix is to make AI write and update files such as DEVELOPMENT_CONTEXT.md and subsystem CONTEXT.md during work, so new sessions can bootstrap quickly and stay aligned.
  • Those files should capture not just decisions but the reasoning behind them; the author says missing “why” details lets later sessions undo deliberate choices or weaken quality standards.
  • The broader claim is that effective AI development depends less on prompt tricks than on disciplined context management—balancing what to persist, what to keep active and what to drop.
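The file-based workflow described above can be sketched as a small helper that appends dated decisions, together with the reasoning behind them, to a persistent context file. This is a minimal illustration under assumptions: the `record_decision` function, the entry format, and the example decision are hypothetical, not from the article; only the `DEVELOPMENT_CONTEXT.md` filename comes from the source.

```python
from datetime import date
from pathlib import Path

# Filename taken from the article; location and format are assumptions.
CONTEXT_FILE = Path("DEVELOPMENT_CONTEXT.md")

def record_decision(decision: str, why: str) -> None:
    """Append a dated decision entry that captures the 'why',
    so a fresh AI session can bootstrap from the file instead
    of relying on a prior chat's memory."""
    entry = (
        f"\n## {date.today().isoformat()}: {decision}\n"
        f"**Why:** {why}\n"
    )
    if not CONTEXT_FILE.exists():
        CONTEXT_FILE.write_text("# Development Context\n")
    with CONTEXT_FILE.open("a", encoding="utf-8") as f:
        f.write(entry)

# Hypothetical example entry:
record_decision(
    "Use SQLite for the job queue",
    "Keeps deployment single-binary; a separate database server was rejected as overkill.",
)
```

A new session can then be pointed at `DEVELOPMENT_CONTEXT.md` first, recovering both the decisions and the reasoning that would otherwise be lost to compaction.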