The launch of GPT-5 did not make the expected impression. Just days after a lukewarm reception and harsh criticism, Sam Altman, CEO of OpenAI, shifted the conversation: GPT-6 is in development, and it promises to do better. Much better, in fact, if his statements are to be believed. Except that at OpenAI, promises sometimes come with a sense of déjà vu.
GPT-5, though highly anticipated, mainly confirmed a fear that some observers had been voicing for months: progress is plateauing. Technical improvements, yes; a revolution, no. Its tone, perceived as cold, its distant personality, and its functional limitations provoked immediate rejection among the most engaged users. The model was called a “disaster” on social media. Altman acknowledged “mistakes” in the rollout and tried to appease critics by emphasizing the emotional dimension of AI, too absent, he conceded, from the initial version.
It is in this climate of distrust that GPT-6 was first mentioned. No release date, no demonstration, but a pitch that is already calibrated. Altman describes a model that will remember users’ preferences, habits, even their personality. He insists: “people want memory.” GPT-6 would therefore integrate an extended contextual memory, able to adapt its responses over the long term, with personalization pushed to the point of reflecting each individual’s worldview. The AI could be “neutral,” “centered,” or “super woke” according to stated preferences. A flexibility that, according to him, meets user expectations. Except that just days earlier, he himself had warned against the dangers of such overly malleable AIs, capable of fueling delusional spirals in vulnerable users...
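In application terms, this kind of memory resembles a pattern developers already build by hand: persist what the user has declared and replay it as standing context on every request. The snippet below is a minimal sketch of that pattern using the current OpenAI Python SDK; the “gpt-6” model name and the PreferenceStore class are hypothetical illustrations, not an announced API.

```python
import json
from pathlib import Path
from openai import OpenAI  # current OpenAI Python SDK

class PreferenceStore:
    """Hypothetical per-user memory: stated preferences persisted to disk."""
    def __init__(self, path="prefs.json"):
        self.path = Path(path)
        self.prefs = json.loads(self.path.read_text()) if self.path.exists() else {}

    def remember(self, key, value):
        self.prefs[key] = value
        self.path.write_text(json.dumps(self.prefs, indent=2))

    def as_system_prompt(self):
        # Replay everything we know about the user as standing instructions.
        lines = [f"- {k}: {v}" for k, v in self.prefs.items()]
        return "Known user preferences:\n" + "\n".join(lines) if lines else ""

client = OpenAI()
store = PreferenceStore()
store.remember("tone", "neutral")               # could be "centered", "super woke", ...
store.remember("worldview", "skeptical of hype")

response = client.chat.completions.create(
    model="gpt-6",  # hypothetical model name, used here only for illustration
    messages=[
        {"role": "system", "content": store.as_system_prompt()},
        {"role": "user", "content": "Summarize today's AI news for me."},
    ],
)
print(response.choices[0].message.content)
```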
The contradiction reveals a strategic tension: meeting the demand for personalization without crossing ethical red lines. Altman half-admits this. He mentions a minority of users unable to distinguish fiction from reality, but maintains that the majority can handle it. It’s a risky bet...
On the technical side, OpenAI says it wants to overcome current structural limitations. The model router, criticized for misrouting requests, is being redesigned. Context-window management, currently capped at 128,000 tokens for Pro users, is another area to improve: GPT-6 should handle far larger volumes, enabling much longer conversations or the analysis of massive data corpora.
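To see what a larger window would change in practice, consider the workaround it would remove: today, a corpus that exceeds the limit has to be split into token-budgeted chunks and processed piecewise. Below is a rough sketch of that chunking step using the tiktoken tokenizer; the 128,000-token budget simply mirrors the figure cited above, and “corpus.txt” is a placeholder file.

```python
import tiktoken

def chunk_by_tokens(text: str, budget: int = 128_000, overlap: int = 200):
    """Split text into pieces that each fit within a token budget.

    A small overlap is kept between chunks so sentences cut at a
    boundary still appear in full in at least one chunk.
    """
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    chunks, start = [], 0
    while start < len(tokens):
        end = min(start + budget, len(tokens))
        chunks.append(enc.decode(tokens[start:end]))
        if end == len(tokens):
            break
        start = end - overlap  # step back slightly to create the overlap
    return chunks

corpus = open("corpus.txt", encoding="utf-8").read()  # placeholder input file
pieces = chunk_by_tokens(corpus)
print(f"{len(pieces)} chunk(s) needed under a 128k-token window")
```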
On architecture, hybrid models are being considered: a combination of local and cloud processing to improve speed, strengthen confidentiality, and allow offline use. Also on the table is an adaptive compute system, in which lightweight models would handle simple requests while more powerful sub-models would take on the heavy cases. Modularity designed with professional use and cost constraints in mind.
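The adaptive-compute idea maps onto a pattern many teams already implement on their own side of the API: estimate how demanding a request is and dispatch it to a cheap or an expensive model accordingly. The sketch below illustrates that routing logic; the model names and the complexity heuristic are placeholders, not OpenAI’s actual router.

```python
from openai import OpenAI

client = OpenAI()

# Illustrative tiers; these names are placeholders, not announced models.
LIGHT_MODEL = "gpt-6-mini"
HEAVY_MODEL = "gpt-6"

def looks_complex(prompt: str) -> bool:
    """Crude complexity heuristic: long prompts or multi-step asks."""
    markers = ("step by step", "prove", "refactor", "analyze this")
    return len(prompt) > 2_000 or any(m in prompt.lower() for m in markers)

def answer(prompt: str) -> str:
    # Dispatch cheap requests to the light tier, heavy ones to the big model.
    model = HEAVY_MODEL if looks_complex(prompt) else LIGHT_MODEL
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(answer("What's the capital of Portugal?"))  # routed to the light tier
```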
But the most ambitious innovation is taking shape elsewhere. Altman mentioned a brain-machine interface, a potential competitor to Elon Musk’s Neuralink, so that users could “think” their requests rather than type or speak them. The project is still in its infancy, but it illustrates OpenAI’s desire to break free from the traditional framework of text and voice interaction. Ultimately, this type of interaction would raise massive privacy questions, which the company has barely addressed at this stage. The idea of data encryption was floated, without any firm commitment.
Finally, GPT-6 is envisioned as a scientific tool. Not just an assistant or a conversation partner, but a potential engine of innovation in medicine or environmental science. Here again, the statements remain vague and the use cases unclear. All we know is that the ambition is to cross a functional threshold, so that AI is no longer just a dialogue interface but a research force.
Overall, it looks less like a structured plan than an attempt to regain control after a failure. GPT-5 disappointed, and Altman knows it. By betting on memory, radical personalization, neural interfaces, and scientific promises, OpenAI is trying to reignite enthusiasm. But it will take more than words: if GPT-6 is not a clear break, user fatigue could set in, and to some extent it already has.