SGLang Spins Out as RadixArk with $400M Valuation Amid Inference Infrastructure Boom
SGLang, the open‑source inference accelerator, has been spun out into RadixArk, a startup valued at about $400 million that aims to democratize AI infrastructure, as the broader market sees a surge of funding for inference‑focused companies.
SGLang, an open‑source tool that speeds up AI model inference, has been spun out into a commercial entity called RadixArk. The spin‑out was led by Accel in a new financing round that valued the company at about $400 million, a notable valuation for a startup that debuted only last August.
The project originated in 2023 in the Berkeley lab of Databricks co‑founder Ion Stoica. Core contributor Ying Sheng, formerly an engineer at xAI and a research scientist at Databricks, left Elon Musk’s AI startup to become co‑founder and CEO of RadixArk.
RadixArk’s mission is to make cutting‑edge AI infrastructure open and accessible. It is developing two core products: the continued evolution of the SGLang inference engine and Miles, an open‑source framework for large‑scale reinforcement‑learning training. The company claims these tools can reduce the cost of building, training, and running frontier models by tenfold and increase accessibility by the same factor.
This spin‑out follows a trajectory similar to that of the more mature inference‑optimization project vLLM, which also emerged from Stoica’s lab. vLLM recently raised over $160 million at a roughly $1 billion valuation, with Andreessen Horowitz leading the round.
A partner at CRV noted that SGLang’s popularity has surged in the past six months, with several large technology firms already using the tools for inference workloads. Although the core software remains free, RadixArk has begun charging for its hosted services.
The broader market for inference‑infrastructure startups has been very active. Baseten secured $300 million at a $5 billion valuation, and Fireworks AI raised $250 million at a $4 billion valuation, as reported by the Wall Street Journal. The article observes that packaging open‑source tools as companies has become a standard approach, and questions whether these firms can truly democratize AI infrastructure or will primarily serve well‑funded customers.
