DeepSeek V4 Unveiled: 1.6 T Parameters, Million‑Token Context, Fully Open‑Source
DeepSeek V4 introduces two open‑source MoE models—Pro and Flash—offering up to 1.6 T parameters, a 1 M‑token context window, and a new DSA sparse‑attention mechanism. The release includes extensive benchmark results and a tiered pricing scheme, and both models remain compatible with the OpenAI and Anthropic APIs.
