Why DeepSeek’s Founder Made Nature’s 2025 Top‑10 Scientists List
Nature’s 2025 “Nature’s 10” list recognized DeepSeek founder Liang Wenfeng for advancing AI transparency, citing the impact of his open‑weight models on researchers and their low‑cost performance; the list also honored nine other distinguished scientists.
Nature’s 10 2025 – DeepSeek Founder Recognized
In December 2025, Nature announced its “Nature’s 10” list for the year, naming DeepSeek founder Liang Wenfeng for his contributions to AI transparency.
Open‑weight model strategy
DeepSeek publishes the trained weight files of its models directly, allowing anyone to download and fine‑tune them without training from scratch. The weights are released under a permissive open‑source license, with code hosted in the public repository https://github.com/DeepSeek-AI. This “open‑weight” approach is intended to accelerate research by providing free, high‑quality pretrained models.
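In practice, “download and fine‑tune” means pulling the published checkpoint with standard tooling. A minimal sketch using the Hugging Face `transformers` library (an assumption on our part; the article does not specify tooling, and the repo id passed in would be whichever open‑weight checkpoint you want):

```python
def load_open_weights(repo_id: str):
    """Fetch an open-weight checkpoint and return (tokenizer, model),
    ready for inference or further fine-tuning."""
    # Imported lazily so the sketch can be defined even where the
    # heavy dependencies are not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    # torch_dtype="auto" keeps the dtype stored in the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto")
    return tokenizer, model

# Example call (requires network access and substantial memory):
# tokenizer, model = load_open_weights("deepseek-ai/DeepSeek-R1")
```

Because the weights are public, the same call works for anyone; no access request or retraining from scratch is involved.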
R1 model release
In January 2025 DeepSeek launched the R1 large language model. R1 is positioned as a high‑performance yet low‑cost alternative to commercial offerings: it is a mixture‑of‑experts model with 671 B total parameters, of which roughly 37 B are activated per token. The architecture, training approach, and benchmark results are documented in the accompanying technical report, and the model weights and inference scripts can be obtained from the official repository, https://github.com/deepseek-ai/DeepSeek-R1.
Recognition and other honorees
Nature described Liang as “a Chinese fintech prodigy who shocked the world with the DeepSeek AI model” and called the open‑source model a “great boon for scientists.” The other nine individuals on the list span physics, geology, systems biology, neurology, gene‑editing therapy, data science, agricultural research, public health, and microbiology.
21CTO
Source: 21CTO (21CTO.com), a community, training, and services platform for developers.
