The Rise of Data Processing Units (DPUs): Market Trends, Architecture, and the 2023 DPU Summit
Since Nvidia's CEO highlighted DPUs as the third core chip for data centers in 2020, the technology has rapidly advanced, reshaping semiconductor markets, introducing CPU‑DPU‑GPU task partitioning, and driving a booming ecosystem highlighted by the 2023 DPU Summit in Beijing.
In 2020, the semiconductor industry saw a new hotspot: the Data Processing Unit (DPU). After Nvidia’s CEO labeled it the "third main chip" for data centers, DPUs quickly rose to prominence, attracting widespread attention across the industry and society.
By 2023, three years of development had produced a new landscape: self‑developed and startup DPUs made rapid progress, applications began to materialize, and AI‑generated content (AIGC) services such as ChatGPT drove demand for high‑performance computing, further accelerating DPU growth.
The top five global chip companies have all staked claims in the DPU arena, and market forecasts predict that the global DPU market will reach $24.53 billion (≈¥177.1 billion) by 2025, signaling a phase of leapfrog growth.
The 2023 DPU Summit, held on August 4 in Beijing, gathered more than 30 senior representatives from the Chinese Academy of Engineering, telecom operators, traditional chip makers, and startups to share the latest DPU developments and practices.
From a computing‑architecture perspective, the traditional von Neumann model places the CPU at the center, handling all tasks—including networking, storage, encryption, and compression. However, this model is not always the most efficient; specialized chips like DPUs excel in I/O‑intensive scenarios, offering higher data‑transfer efficiency.
As generative AI, data science, autonomous driving, and the metaverse accelerate, massive compute demand turns compute power into a core production factor, prompting a division of labor: CPUs handle general workloads, GPUs accelerate parallel tasks such as graphics and deep learning, while DPUs offload security, networking, storage, and AI‑specific processing.
The article offers a restaurant analogy: at first a single "CPU chef" handles every duty, but as traffic grows, a dedicated "DPU cashier" and a "GPU waiter" are hired, freeing the chef to focus on cooking and improving overall efficiency.
DPUs are positioned as peers to CPUs and GPUs, interacting to complete computing tasks, with DPUs specifically managing I/O processing and data transfer.
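The division of labor described above can be sketched as a toy dispatcher. This is purely illustrative: the task categories and routing table are assumptions for the sketch, not an actual DPU programming API.

```python
# Toy model of the CPU/GPU/DPU task partitioning described in the article.
# The "kind" labels and routing rules are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    kind: str  # "general", "parallel", or "io"


def dispatch(task: Task) -> str:
    """Route a task to the unit this model assigns to its workload type."""
    routes = {
        "general": "CPU",   # general-purpose workloads
        "parallel": "GPU",  # graphics / deep-learning style parallel work
        "io": "DPU",        # networking, storage, security, data movement
    }
    return routes.get(task.kind, "CPU")  # unknown work defaults to the CPU


if __name__ == "__main__":
    tasks = [
        Task("run OS scheduler", "general"),
        Task("train model batch", "parallel"),
        Task("encrypt network packet", "io"),
    ]
    for t in tasks:
        print(f"{t.name} -> {dispatch(t)}")
```

In a real system the routing is done by drivers and hardware queues rather than application code, but the principle is the same: I/O-heavy data-path work bypasses the CPU and lands on the DPU.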
Current DPU implementations follow ASIC, FPGA, or SoC paths, each with trade‑offs in cost, programmability, and flexibility. Market solutions include Arm‑based architectures, FPGA+CPU hybrids, and SoC designs, with SoC designs widely seen as the future trend.
The primary DPU market is data centers, especially public cloud providers like Alibaba and Tencent, which are either developing or investing in DPU technology. Private‑cloud needs are more complex, requiring close alignment between DPU products and specific business scenarios.
Beyond data centers, DPUs are also relevant to intelligent driving, data communications, and network security. In autonomous vehicles, each on‑board computer can be viewed as a mini‑data‑center, generating substantial processing, forwarding, and storage demands that DPUs can address.
Overall, DPUs are still in an early stage, but the global market is expected to become a blue‑ocean in the coming years. Domestic DPU vendors must deepen cooperation with cloud providers and ecosystem partners to build a more open and robust DPU ecosystem.
The article concludes by promoting the third DPU Summit on August 4, organized by the China Communications Society and the Jiangsu Future Network Innovation Research Institute, inviting interested readers to register free of charge.
Architects' Tech Alliance
Sharing project experiences, insights into cutting-edge architectures, focusing on cloud computing, microservices, big data, hyper-convergence, storage, data protection, artificial intelligence, industry practices and solutions.