Thinking Machines Unveils Real-Time AI Interaction Models, Eyes Wider Release Later in 2026
2 articles · Updated · The Verge · May 11
Thinking Machines said its new “interaction models” continuously process audio, video and text, letting AI think, respond and act in real time rather than waiting for a user to finish.
The startup said the system is designed to fix a human-AI “bandwidth bottleneck” in which current models perceive and generate in a single thread, limiting collaboration and context.
Demos included spotting animal mentions in a story, translating speech live, and alerting a user when they were slouching.
A limited research preview is planned in the coming months, with a wider release targeted for later this year.
Mira Murati founded Thinking Machines in February 2025 after leaving OpenAI, and the lab has already faced notable staff departures to Meta and back to OpenAI.
Can Thinking Machines solve the “AI memory wall” to make its real-time interaction models commercially viable at a $50 billion valuation?
Will AI with “continuous perception” become a seamless collaborator, or will it be the ultimate form of digital intrusion into our lives?
How will we govern proactive AI “colleagues” that can act on their own, moving faster than any human oversight?
Thinking Machines has introduced its “interaction models,” which enable real-time, multimodal interaction by processing audio, video, and text simultaneously. Built on a multi-stream, micro-turn approach, the models move beyond traditional turn-based systems to deliver low-latency experiences that mimic natural human conversation. By making interactivity a core feature, they are designed to become smarter and more collaborative as they scale, positioning Thinking Machines to stand out in a competitive AI landscape with more natural, intuitive engagement for users.
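To make the contrast concrete, here is a minimal toy sketch of the difference between a turn-based loop and a micro-turn loop that can act mid-stream. Everything in it (the `Event` type, the slouch trigger, the function names) is a hypothetical illustration for this article, not Thinking Machines' actual API or architecture.

```python
from dataclasses import dataclass

@dataclass
class Event:
    stream: str   # e.g. "audio", "video", or "text"
    payload: str

def turn_based(events):
    # Classic single-thread model: consume the entire turn,
    # then produce one response at the end.
    transcript = " ".join(e.payload for e in events)
    return [f"reply to: {transcript}"]

def micro_turns(events, trigger="slouch"):
    # Micro-turn model: inspect each small chunk of each stream as it
    # arrives, and optionally act immediately instead of waiting.
    outputs = []
    for e in events:
        if trigger in e.payload:
            outputs.append(f"alert: detected '{trigger}' on {e.stream}")
    outputs.append("final reply")
    return outputs

events = [
    Event("audio", "hello"),
    Event("video", "user slouch"),
    Event("audio", "more talk"),
]
print(turn_based(events))   # one response, only after the full turn
print(micro_turns(events))  # interleaved alert plus a final reply
```

The point of the sketch is purely structural: the turn-based version cannot say anything until the input ends, while the micro-turn version can raise the slouching alert as soon as the relevant video chunk arrives.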