HyperAI Super Neural
Feb 11, 2026 · Artificial Intelligence
Reduce Memory by 75% Using D‑CHAG’s Cross‑Channel Hierarchical Aggregation
Researchers at Oak Ridge National Laboratory have introduced D‑CHAG, a distributed cross‑channel hierarchical aggregation method for training massive multi‑channel foundation models. Demonstrated on hyperspectral‑imaging and weather‑forecasting workloads across up to 1,024 AMD GPUs, D‑CHAG cuts memory consumption by up to 75% and more than doubles training throughput.
D-CHAG · Distributed Training · Foundation Models
