How a 0.5 MB AI Model Tackles Global Supply‑Chain Challenges: Li‑Net Technology and Applications

This article presents Li‑Net, a 0.5 MB lightweight time‑series model co‑developed by SF Technology and two universities and accepted at ICDE 2026. The model addresses multi‑channel, non‑stationary, multimodal forecasting, achieves state‑of‑the‑art accuracy at low latency, and is deployed across SF's global logistics network to improve demand, inventory, and capacity planning while cutting costs.


Paper background and ICDE acceptance: Li‑Net was jointly developed by SF Technology, Zhejiang University, and Zhejiang University of Technology, focusing on industrial‑grade multi‑channel time‑series forecasting. The work was accepted at ICDE 2026, a rare combination of academic innovation and real‑world logistics deployment.

Industry pain points and challenges: Global supply chains suffer from three major issues: (1) coupling of multiple variables, (2) strong non‑stationary fluctuations, and (3) difficulty integrating multimodal data. Conventional models exhibit channel‑wise bias, O(n²) attention complexity, and oversized SOTA models that cannot run on edge devices, limiting real‑time prediction and large‑scale node scheduling.

Technical innovation and methodology: Li‑Net adopts an Encoder‑Processor‑Decoder pipeline. It replaces simple feature concatenation with multimodal embeddings that guide attention. A dual‑dimensional (time + channel) Top‑K sparse attention yields linear‑time computation. The backbone relies on a lightweight MLP, enabling high precision under low‑compute constraints.
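The paper's exact formulation of the dual‑dimensional Top‑K sparse attention is not reproduced here, but the idea can be sketched: each query keeps only its Top‑K highest‑scoring keys, and the mechanism is applied once along the time axis and once along the channel axis. The sketch below is a hypothetical minimal version with no learned projections (the series itself serves as query, key, and value); note that it still materializes the full score matrix, whereas Li‑Net's linear‑time variant presumably avoids that.

```python
import numpy as np

def topk_sparse_attention(q, k, v, top_k):
    """Softmax attention where each query attends only to its
    top_k highest-scoring keys; all other scores are masked out.
    q, k, v: (n, d) arrays; returns (n, d)."""
    scores = q @ k.T / np.sqrt(q.shape[-1])           # (n, n) score matrix
    # Indices of the top_k scores per row (order within top_k is irrelevant).
    idx = np.argpartition(scores, -top_k, axis=-1)[:, -top_k:]
    mask = np.full_like(scores, -np.inf)
    np.put_along_axis(mask, idx, 0.0, axis=-1)
    masked = scores + mask                            # only top_k entries survive
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def dual_dim_sparse_attention(x, top_k=4):
    """Apply Top-K sparse self-attention along the time axis, then
    along the channel axis, of a (time, channels) series."""
    time_out = topk_sparse_attention(x, x, x, min(top_k, x.shape[0]))
    chan = time_out.T                                 # (channels, time)
    chan_out = topk_sparse_attention(chan, chan, chan, min(top_k, chan.shape[0]))
    return chan_out.T                                 # back to (time, channels)

x = np.random.default_rng(0).standard_normal((96, 8))  # 96 steps, 8 channels
y = dual_dim_sparse_attention(x, top_k=4)
print(y.shape)  # (96, 8)
```

In a full model, the sparse attention output would feed the lightweight MLP backbone described above; the Top‑K masking is what keeps each query's effective context small regardless of sequence length.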

Model performance and results: Across five datasets and 24 configurations, Li‑Net achieved 20 SOTA records, with an average MAE of 0.3443, substantially surpassing iTransformer, PatchTST, and TFT. The model size is only 0.5 MB, inference latency ranges from 0.4 s to 0.56 s, and training memory usage is minimal, delivering an optimal balance of accuracy and efficiency.

Deployment and enterprise value: Li‑Net has been rolled out at scale in SF’s global supply‑chain scenarios, including sales forecasting, inventory optimization, and capacity planning. The deployment reduces stock‑out and overstock costs, improves turnover and fulfillment rates, and provides a quantifiable, reproducible, edge‑deployable AI solution for cost reduction and efficiency gains.

Audience takeaways: Attendees learn (1) industrial solutions for multi‑channel, multimodal, long‑horizon forecasting that can be applied directly to logistics, retail, and warehousing; (2) how dual‑dimensional Top‑K sparse attention, multimodal guidance of attention, and a lightweight MLP backbone combine to achieve SOTA accuracy with a 0.5 MB model on edge devices; and (3) a complete pipeline from research innovation to engineering implementation and measurable business impact.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Supply Chain · time series forecasting · edge deployment · multimodal attention · Li-Net · lightweight AI model
Written by

DataFunSummit

Official account of the DataFun community, dedicated to sharing big data and AI industry summit news and speaker talks, with regular downloadable resource packs.
