OpenAI’s Rapid Sprint: GPT‑5.6 Leaked and a $400 Subsidy to Oust Claude Code
Within three weeks of GPT‑5.5’s launch, internal code referencing GPT‑5.6 surfaced. OpenAI responded with a 2–3× “ultrafast” mode and a two‑month free Codex offer worth $400 aimed at luring away Claude Code users, answering Anthropic’s Opus 4.7 Fast and highlighting a self‑reinforcing acceleration loop toward ASI.
01 OpenAI Accelerates, Codex to Triple Speed
Leaked internal checkpoints show that development of GPT‑5.6 entered full‑speed mode only days after GPT‑5.5’s release. OpenAI plans to roll out an “ultrafast” mode that boosts response speed by 2–3× for latency‑sensitive tasks. The pattern of rapid iteration is already established: after GPT‑5.4’s release in March, Codex’s /fast mode delivered a 1.5× speedup, and GPT‑5.3‑Codex‑Spark, running on Cerebras chips, reached over 1,000 tokens/s, roughly 15× the normal mode. The new ultrafast mode applies its multiplier to the flagship model rather than a stripped‑down variant, promising dramatic gains for agent loops, long‑task pipelines, and browser automation.
02 Full‑Scale Battle: Codex vs Claude Code
Anthropic pre‑emptively launched Opus 4.7 Fast on June 15, offering faster advanced reasoning, longer context handling, and smoother “vibe” coding than Codex. Shortly after, OpenAI announced a direct counter‑move: any enterprise switching from Claude Code to Codex within the next 30 days receives two months of free usage, valued at $400 at the Pro rate of $200/month. A prominent community figure amplified the offer, calling Codex the strongest AI programming product on the market.
Within three hours of the announcement, 2,000 developers contacted OpenAI, indicating strong demand for the migration incentive. The competition has been framed as a “battle” that benefits developers, who stand to gain from faster, more capable AI coding tools.
03 When Iteration Speed Approaches ASI
The article argues that AI self‑acceleration and commercialization form a positive‑feedback flywheel. GPT‑5.3‑Codex was the first model to “participate in its own training,” and by the time of GPT‑5.5, 85% of OpenAI staff were using Codex weekly. GPT‑5.6 is likely being developed with similarly deep internal involvement, illustrating the loop in which AI builds stronger AI.
Simultaneously, AI‑assisted programming tools are scaling: Codex has three million weekly active users, and Claude Code’s user base is exploding. As millions of developers adopt these tools, AI‑generated code feeds back into model training and deployment, further accelerating the cycle. Anthropic’s and OpenAI’s subsidy wars amplify this momentum, pushing the industry toward artificial superintelligence (ASI).
Overall, the rapid release cadence, aggressive pricing tactics, and the growing reliance on AI coding assistants suggest a self‑reinforcing acceleration that could bring the field ever closer to ASI.
DataFunTalk