
How UCloud Achieved a PUE of 1.3: Green Data Center Strategies and Results

This article explains how UCloud reduced its data center Power Usage Effectiveness to around 1.3 through advanced cooling technologies, energy‑efficient UPS, modeling, and other green measures, while also reviewing international benchmarks and China’s regulatory targets for sustainable data centers.

UCloud Tech

PUE Concept Overview

Power Usage Effectiveness (PUE) was introduced in 2006 by Christian Belady (then at HP) and later standardized by The Green Grid consortium to measure data‑center energy efficiency. It is the ratio of total facility energy to IT equipment energy; it is always greater than 1, and the closer it is to 1, the more efficient the center.
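As a quick illustration of the definition above (the numbers are hypothetical, not UCloud's meter readings), the ratio can be computed directly:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy.

    A value of 1.0 would mean every kilowatt-hour reaches IT equipment;
    real facilities sit above 1 because of cooling, power conversion,
    lighting, and other overhead.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: 1300 kWh total facility energy for 1000 kWh of IT load
print(round(pue(1300, 1000), 2))  # 1.3
```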

International and Domestic Progress

Western operators such as Amazon, Google, and Microsoft have long prioritized natural cooling (air‑side and water‑side economization) and achieved PUE values in the 1.15–1.25 range. Notable examples include Microsoft’s Dublin data center (PUE 1.25) and Facebook’s facility (PUE 1.15). In China, the “13th Five‑Year Plan” requires new large‑scale cloud data centers to reach PUE ≤ 1.5 by 2018 and ≤ 1.4 by 2020.

Energy Consumption Analysis

Data‑center energy consumption comprises IT equipment, UPS conversion losses, cooling, lighting, fresh‑air handling, and auxiliary systems. Cooling accounts for the largest share of non‑IT (overhead) energy, making it the most impactful target for optimization.
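To make the breakdown concrete, here is a minimal sketch with hypothetical annual figures (illustrative only; the shares are not UCloud's measurements) showing how each overhead subsystem adds to PUE:

```python
it_load_mwh = 10_000          # annual IT equipment energy (hypothetical)
overhead_mwh = {              # annual non-IT energy by subsystem (hypothetical)
    "cooling": 2_000,         # typically the largest overhead share
    "ups_losses": 500,
    "lighting_aux": 150,
    "fresh_air": 100,
}

total_mwh = it_load_mwh + sum(overhead_mwh.values())
pue = total_mwh / it_load_mwh
print(f"PUE = {pue:.3f}")
# Each subsystem's PUE contribution is its energy divided by the IT energy.
for name, mwh in sorted(overhead_mwh.items(), key=lambda kv: -kv[1]):
    print(f"  {name:>12}: +{mwh / it_load_mwh:.3f}")
```

Framing overhead this way shows why cooling dominates the savings opportunity: cutting its share in half moves PUE far more than eliminating lighting entirely.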

Cooling Solution Research

We evaluated several cooling methods: direct‑expansion air‑cooling, air‑cooled water systems, water‑cooled water systems, direct fresh‑air cooling, indirect evaporative cooling, and liquid‑cooled rack systems. Each has distinct advantages regarding energy efficiency, water usage, environmental adaptability, and supply‑chain maturity.

Modeling and Parameter Optimization

Using operational data, we built a comprehensive energy model for the data center. Real‑time measurements allow us to calculate savings, compare with expectations, adjust parameters, and iteratively refine the model until the target PUE is achieved.
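The measure‑compare‑adjust loop described above can be sketched as follows; `fake_meter` stands in for real telemetry, and the single chilled‑water setpoint is a stand‑in for the many parameters actually tuned (all names and numbers here are hypothetical, not UCloud's control system):

```python
def tune(measured_pue, params, target=1.3, step=0.5, max_iters=50):
    """Nudge a cooling setpoint until the measured PUE meets the target."""
    actual = measured_pue(params)
    for _ in range(max_iters):
        if actual <= target:
            break
        # Raising chilled-water supply temperature generally extends
        # free-cooling hours and cuts chiller energy (within safe limits).
        params["chw_supply_c"] += step
        actual = measured_pue(params)
    return params, actual

# Toy stand-in for real metering: PUE falls as the setpoint rises.
def fake_meter(params):
    return 1.45 - 0.02 * (params["chw_supply_c"] - 7.0)

params, achieved = tune(fake_meter, {"chw_supply_c": 7.0})
print(f"setpoint {params['chw_supply_c']} °C -> PUE {achieved:.2f}")
```

In practice the "meter" is the facility's real‑time measurement system and the model is recalibrated whenever prediction and measurement diverge, but the feedback structure is the same.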

Additional Energy‑Saving Measures

Building Energy Efficiency: Envelope U‑values, window‑to‑wall ratios, and shading meet national standards; the main hall has no external windows to preserve thermal performance.

Water System Efficiency: Pumps are selected based on hydraulic calculations to operate at high efficiency, complying with national pump energy‑efficiency standards.

Cooling Plant Efficiency: High‑efficiency variable‑frequency centrifugal chillers (COP 7.1, IPLV ≥ 6.0, SCOP ≥ 4.4) are used, with automatic BAS control to match load and maximize natural cooling.

High‑Voltage DC Power: Reducing rectifier‑inverter stages cuts losses and component count, improving overall efficiency and lowering cost.

Heat Recovery: Water‑source heat pumps capture rack exhaust heat for winter heating and summer cooling, reducing auxiliary energy consumption.
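As a rough sanity check on what the quoted chiller figures imply, the sketch below converts the article's SCOP floor into a cooling‑power estimate. The 10 MW IT load is an assumption for illustration, and the steady‑state simplification (all IT power becomes heat to be rejected) is mine, not the article's:

```python
it_load_kw = 10_000                 # hypothetical IT load, not UCloud's figure
heat_to_reject_kw = it_load_kw      # steady state: IT power ends up as heat
scop = 4.4                          # system COP floor quoted for the plant

# Electrical power the cooling plant draws to move that heat.
cooling_power_kw = heat_to_reject_kw / scop
print(f"Cooling plant draw: {cooling_power_kw:.0f} kW")
print(f"PUE contribution from cooling: +{cooling_power_kw / it_load_kw:.3f}")
```

Even at the guaranteed minimum SCOP, cooling overhead stays below 0.23 of the IT load, which is consistent with an overall PUE near 1.3 once UPS, lighting, and auxiliary losses are added.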

Practical Results

The proposed design was first applied to the Wulanchabu data center, achieving a measured PUE of 1.3. This meets Shanghai’s 2019 guidelines, which require new data centers to keep PUE ≤ 1.4 in the first year and ≤ 1.3 thereafter.

Conclusion

UCloud demonstrates that combining advanced cooling, high‑efficiency power, and systematic modeling can deliver significant energy savings while complying with stringent regulatory standards, supporting sustainable “new‑infrastructure” development across China.

Tags: energy efficiency, PUE, data center operations, green data center, UCloud, cooling systems
Written by

UCloud Tech

UCloud is a leading neutral cloud provider in China, developing its own IaaS, PaaS, AI service platform, and big data exchange platform, and delivering comprehensive industry solutions for public, private, hybrid, and dedicated clouds.
