Qwen’s Lead Architect Steps Down: Who Will Steer China’s Top Open‑Source AI Flagship?

On March 4, 2026, Lin Junyang, Alibaba’s youngest P10 technical leader, announced his resignation in a short tweet, just hours after releasing four Qwen 3.5 models that drew praise from Elon Musk. Two other core researchers departed the same day, leaving the future of China’s leading open‑source AI flagship uncertain.


Midnight resignation tweet

On 2026‑03‑04 (Beijing time), Lin Junyang posted on X: "me stepping down. bye my beloved qwen." The seven‑word post was retweeted hundreds of times within hours, signaling a major shift in the Chinese open‑source LLM community.

Professional background

Born in 1993, Lin earned a BSc in Computer Science at Peking University and later pursued graduate studies in linguistics. He joined Alibaba DAMO Academy in 2019, at age 26. His career progression:

2019 – NLP and recommendation‑system research.

2020 – Large‑scale pre‑training, contributing to projects such as M6 and OFA.

2022 – Appointed technical lead of the Qwen series.

2024 – Released open‑source Qwen models that competed with GPT and Claude.

2025 – Promoted to P10, the youngest such executive at Alibaba, and formed an embodied‑intelligence team.

2026‑03‑02 – Led release of four Qwen 3.5 small‑size models (0.8 B, 2 B, 4 B, 9 B).

His scholarly work has been cited more than 42,000 times on Google Scholar.

Team exodus

On the same day as Lin’s tweet, two other core researchers also announced their departures:

Binyuan Hui – senior research scientist focusing on code models and mathematical reasoning.

Kaixin Li – researcher involved in multiple core Qwen capabilities.

No successors or destinations were disclosed, creating a leadership vacuum.

Qwen 3.5 small‑model release

On 2026‑03‑02 the team open‑sourced four Qwen 3.5 models covering parameter scales from 0.8 B to 9 B, intended for edge devices through server‑grade deployments. Elon Musk retweeted the announcement with the comment “Impressive,” providing a rare international endorsement.

Technical milestones

Since 2019, the Qwen line has progressed from a single‑language text model to one spanning multimodal understanding, code generation, math reasoning, and visual perception. On several public benchmarks, Qwen 3.5 matched or outperformed the latest GPT and Claude releases, demonstrating parity with leading commercial models. Lin has publicly noted a compute gap of one to two orders of magnitude between Chinese and U.S. infrastructure.

Open‑source ecosystem impact

The Qwen ecosystem has accumulated more than 600 million downloads and over 170,000 derivative models, indicating a self‑sustaining community.

Open questions

Who will assume technical leadership of the Qwen series?

What are the future plans of the departing researchers?

Can the Qwen project maintain its innovation momentum without Lin’s vision and external communication?

Tags: Alibaba · Large Language Models · Qwen · open-source AI · China AI · Elon Musk · AI talent turnover
Written by AI Explorer