SuanNi
May 12, 2026 · Artificial Intelligence

AntAngelMed: 6.1B‑Activated MoE Model Tops Three Medical Benchmarks

AntAngelMed is a 100‑billion‑parameter medical LLM built on a Mixture‑of‑Experts (MoE) architecture that activates only 6.1 billion parameters per token. It achieves performance comparable to a 40‑billion‑parameter dense model, exceeds 200 tokens/s inference speed, and ranks first on HealthBench, MedAIBench, and MedBench, backed by a three‑stage training pipeline and extensive efficiency optimizations.
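The core idea behind the efficiency claim is sparse activation: a router selects a few experts per token, so only a small fraction of the total parameters does work on each forward pass. The toy sketch below (not AntAngelMed's actual code; all sizes and names are hypothetical) illustrates a top‑k MoE layer of this kind:

```python
import numpy as np

# Illustrative top-k Mixture-of-Experts layer: only top_k of n_experts
# run per token, which is how a ~100B-parameter model can cost only
# ~6.1B activated parameters per token. Toy dimensions, for clarity only.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 16, 2

# Each expert is a simple linear map; the router scores experts per token.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route token x to its top-k experts and mix their outputs."""
    logits = x @ router                    # (n_experts,) router scores
    top = np.argsort(logits)[-top_k:]      # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)  # activates 2 of 16 experts, i.e. 1/8 of expert parameters
```

Here 2 of 16 experts fire per token; at AntAngelMed's scale the same mechanism yields the reported 6.1B activated out of ~100B total parameters.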

Tags: HealthBench · MedAIBench · MedBench