Two NIRC Papers Accepted at NeurIPS 2024: FM-Delta Compression and GLAFF Forecasting

The Beijing University of Posts and Telecommunications' Network Intelligence Research Center (NIRC) had two papers accepted to NeurIPS 2024, presenting FM-Delta, a lossless compression technique that halves storage and cuts cloud costs by over 40%, and GLAFF, a global‑local fusion framework that markedly improves the robustness of time‑series forecasting across multiple domains.


NeurIPS (Conference on Neural Information Processing Systems) is a top‑tier AI conference recommended by the China Computer Federation (CCF), with an H5 index of 337 and a 2024 acceptance rate of 25.8% (15,671 submissions). The Beijing University of Posts and Telecommunications' Network Intelligence Research Center (NIRC) secured two papers in the Main Track.

Paper 1 – FM-Delta: Lossless Compression for Storing Massive Fine‑tuned Foundation Models – The rapid growth of fine‑tuned large language models (LLMs) on platforms such as HuggingFace has increased the number of stored models from 30,000 to nearly 600,000 in two years, with over 90% being fine‑tuned variants. Although each fine‑tuned model contains billions of parameters, most of its parameters remain highly similar to those of its base pretrained model. The authors theoretically show that the parameter difference grows slowly, as O(T^{1/4}) in the number of fine‑tuning steps T. FM-Delta exploits this by converting the floating‑point parameters of both models to integers of the same bit width, subtracting them to obtain a sparse integer delta dominated by zero bits, and then entropy‑encoding the delta. Consequently, a cloud provider needs to store only one full pretrained model plus a compressed delta for each of its fine‑tuned variants. Experiments demonstrate roughly a 50% reduction in cloud storage consumption and a cost saving of at least 40%.
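The integer-delta idea can be sketched in a few lines. Note this is a minimal illustration, not the authors' implementation: FM-Delta's actual integer mapping and entropy coder are not detailed here, so this sketch reinterprets float32 weights as 32-bit integers via NumPy bit views and uses zlib as a stand-in entropy coder.

```python
import zlib
import numpy as np

def compress_delta(base: np.ndarray, finetuned: np.ndarray) -> bytes:
    # Reinterpret float32 weights as 32-bit unsigned integers (bit-exact),
    # subtract modulo 2**32, and entropy-encode the near-zero delta.
    delta = finetuned.view(np.uint32) - base.view(np.uint32)
    return zlib.compress(delta.tobytes(), level=9)

def decompress_delta(base: np.ndarray, blob: bytes) -> np.ndarray:
    # Integer addition exactly inverts the subtraction, so the
    # round trip is lossless down to the last bit.
    delta = np.frombuffer(zlib.decompress(blob), dtype=np.uint32)
    return (base.view(np.uint32) + delta).view(np.float32)

# Toy stand-ins for a pretrained model and a nearby fine-tuned variant.
rng = np.random.default_rng(0)
base = rng.standard_normal(200_000).astype(np.float32)
finetuned = base + 1e-3 * rng.standard_normal(200_000).astype(np.float32)

blob = compress_delta(base, finetuned)
restored = decompress_delta(base, blob)
```

Because the fine-tuned weights sit close to the base weights, the high-order bytes of the integer delta are nearly constant and compress well, whereas the raw float weights themselves are close to incompressible.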

Cosine similarity between fine‑tuned and pretrained models

Paper 2 – GLAFF: Global‑Local Fusion for Robust Multivariate Time‑Series Forecasting – Time‑series forecasting is critical in finance, transportation, energy, climate, and healthcare, yet most existing methods rely solely on local observations and ignore the rich global information encoded in timestamps. The authors illustrate the problem with an hourly traffic‑flow series in which local‑only models severely underestimate flow during holiday peaks. To address this, they propose GLAFF, a plug‑in framework consisting of three modules: (1) an Attention‑based Mapper that maps timestamps, which carry global context, to values following a standard normal distribution; (2) a Robust Denormalizer that inversely normalizes the mapped data to mitigate distribution drift; and (3) an Adaptive Combiner that dynamically balances the global‑mapped prediction against the local forecast. GLAFF is model‑agnostic and can be integrated into any forecasting backbone. Empirical results on nine real‑world benchmark datasets spanning five domains show that GLAFF substantially enhances the robustness and accuracy of mainstream forecasting models.
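The adaptive-combination step can be illustrated with a toy sketch. The real Adaptive Combiner is a learned module; the inverse-error weighting below is only a hypothetical heuristic chosen to show the fusion idea, and all names and numbers are invented for illustration.

```python
import numpy as np

def adaptive_combine(global_pred, local_pred, global_err, local_err, eps=1e-8):
    # Weight each branch by the inverse of its recent error, so the
    # historically more reliable prediction dominates the fused output.
    w_g = 1.0 / (global_err + eps)
    w_l = 1.0 / (local_err + eps)
    return (w_g * global_pred + w_l * local_pred) / (w_g + w_l)

# Toy 3-step horizon: the timestamp-aware global branch tracks a holiday
# peak that the local-only branch badly underestimates.
global_pred = np.array([95.0, 100.0, 98.0])
local_pred = np.array([40.0, 42.0, 41.0])
fused = adaptive_combine(global_pred, local_pred, global_err=2.0, local_err=30.0)
```

With the global branch recently far more accurate, the fused forecast lands close to the global prediction instead of collapsing to the local underestimate, which is the failure mode the traffic-flow example describes.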

GLAFF architecture diagram
Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Time Series Forecasting · AI research · lossless compression · NeurIPS 2024 · fine‑tuned models · FM-Delta · GLAFF · global‑local fusion
Written by

Network Intelligence Research Center (NIRC)

NIRC is affiliated with the State Key Laboratory of Networking and Switching Technology at Beijing University of Posts and Telecommunications. It has built a technology portfolio spanning four AI domains—intelligent cloud networking, natural language processing, computer vision, and machine learning systems—and is dedicated to solving real‑world problems, building top‑tier systems, publishing high‑impact papers, and contributing to the rapid advancement of China's network technology.
