AntTech
Sep 13, 2025 · Artificial Intelligence

LLaDA‑MoE: The First Native MoE Diffusion Language Model Shattering Autoregressive Limits

Ant Group and Renmin University have unveiled LLaDA‑MoE, the industry's first native MoE‑based diffusion language model, trained on 20 TB of data. It achieves performance comparable to Qwen2.5 while delivering several‑fold faster inference, and it will be fully open‑sourced to accelerate global AI research.

AI research · Diffusion Language Model · LLaDA-MoE
6 min read