
Why Claude Crashed Seven Times and Anthropic Is Racing to Build Its Own AI Chip

Anthropic suffered seven major Claude outages in two weeks, exposing a severe compute shortage. The crisis pushed the company to announce an early‑stage, $5 billion AI‑chip project, overhaul its pricing and subscription model, and enforce stricter KYC rules, even as the broader AI industry pivots from Nvidia toward custom silicon.

DataFunTalk

01 Claude Outage Overview

In early April, Anthropic experienced a series of severe service disruptions affecting Claude, Claude Code, and its API, with error rates spiking across the board. The peak of the outage occurred around 10:42 AM EST, when roughly 6,000 users reported issues on Downdetector. Anthropic’s status page recorded three distinct high‑error periods and a three‑hour total downtime on April 15, marking the seventh documented interruption in just two weeks.

10:53 AM – Anthropic began investigating the error cause.

12:30 PM – Login success rates stabilized, and the team focused on resolving remaining issues.

1:50 PM – All systems were declared fully restored.

These repeated failures highlighted a chronic compute capacity shortfall, which Anthropic attributes to “unprecedented demand after major releases.” The company currently runs Claude on a heterogeneous mix of Nvidia GPUs, Google TPUs, and AWS Trainium chips.
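During error spikes like these, API clients typically see transient failures rather than a clean outage. A standard client‑side mitigation, independent of any particular provider, is retry with exponential backoff and jitter. The sketch below is illustrative: the `flaky` callable stands in for any API request that can fail transiently; it is not Anthropic's actual API.

```python
import random
import time

def call_with_backoff(fn, max_retries=5, base_delay=1.0, max_delay=30.0):
    """Retry a flaky zero-argument callable with exponential backoff plus jitter.

    fn should raise an exception on transient failure (e.g. an HTTP
    "overloaded" response mapped to an exception by the client library).
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            # Exponential backoff: base, 2*base, 4*base, ... capped at max_delay,
            # with random jitter to avoid synchronized retry storms.
            delay = min(base_delay * 2 ** attempt, max_delay)
            time.sleep(delay * random.uniform(0.5, 1.0))

# Simulated flaky endpoint: fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("overloaded")
    return "ok"

result = call_with_backoff(flaky, base_delay=0.01)  # succeeds on the third call
```

The jitter factor matters in incidents like this one: if thousands of clients retry on the same fixed schedule, the synchronized retry wave itself can prolong the outage.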

02 Anthropic’s Chip Initiative

Facing mounting compute pressure, Anthropic disclosed plans to develop its own AI‑focused chip. Reuters reported a $5 billion entry fee for the project, which remains at an "extremely early" stage, with no dedicated design team or finalized architecture. Industry benchmarks suggest that designing a cutting‑edge AI chip costs $500 million to $1 billion and takes three to four years from concept to volume production, as Google's TPU timeline (2013‑2018) illustrates. Anthropic's strategy mirrors moves by other AI players: Meta's MTIA, OpenAI's partnership with Broadcom, and a joint 3.5 GW super‑computing effort with Google and Broadcom. The broader shift toward custom silicon promises 30‑50 % lower total cost of ownership and an order‑of‑magnitude improvement in performance per watt.

03 Pricing and Subscription Changes

To offset soaring inference costs, Anthropic rolled out three measures in recent weeks:

Enterprise pricing overhaul: Claude Enterprise shifted from a pure subscription to a $20 monthly base fee plus usage‑based token charges, lowering the flat fee but exposing heavy users to bills that could double or triple their previous costs.

Claude Code access gate: Users of Claude Code must now pay an additional fee to use third‑party agents such as OpenClaw, as Anthropic prioritizes internal API consumption.

Mandatory KYC enforcement: New real‑name verification requires a government‑issued ID and a live selfie, effectively blocking many domestic users who cannot meet the criteria and costing them their conversation histories and project context.

These changes aim to align revenue with actual compute consumption, especially for high‑usage enterprise customers.
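The hybrid model described above can be made concrete with a simple bill calculation. The per‑million‑token rates below are hypothetical placeholders chosen for illustration, not Anthropic's actual prices; only the $20 base fee comes from the article.

```python
def monthly_cost(base_fee, input_tokens, output_tokens,
                 input_rate_per_m, output_rate_per_m):
    """Hybrid bill: a flat base fee plus per-million-token usage charges.

    Rates are illustrative placeholders, not Anthropic's published pricing.
    """
    usage = (input_tokens / 1e6) * input_rate_per_m \
          + (output_tokens / 1e6) * output_rate_per_m
    return base_fee + usage

# Light user: the $20 base fee dominates the bill.
light = monthly_cost(20.0, 2_000_000, 500_000, 3.0, 15.0)       # 20 + 6 + 7.5
# Heavy user: usage charges dwarf the base fee, which is why heavy
# users can end up paying a multiple of the old flat subscription.
heavy = monthly_cost(20.0, 200_000_000, 50_000_000, 3.0, 15.0)  # 20 + 600 + 750
```

This is the mechanism behind the "double or triple" warning: once charges scale with tokens, the bill tracks consumption rather than a fixed tier.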

04 Future Outlook and Industry Context

Despite the aggressive pricing and chip initiatives, Anthropic’s self‑built silicon will not be ready until after 2027, leaving the company dependent on external providers for the foreseeable future. The ongoing outages serve as a stark reminder of the supply‑side ceiling in AI compute. Industry analysts note that while Anthropic’s valuation sits around $380 billion, with 70 % of enterprise first‑time AI purchases favoring Claude, the ultimate bottleneck remains hardware. Venture capital interest remains high, with the next funding round potentially valuing the firm at $800 billion.

In summary, Anthropic’s recent crises underscore the strategic imperative for AI firms to control their compute stack, but the path to self‑sufficiency is long and fraught with technical and regulatory challenges.

Tags: pricing strategy, Claude, AI chip, Anthropic, compute shortage
Written by DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
