Machine Heart
Apr 4, 2026 · Artificial Intelligence

Does Scale Stealthily Hijack Attention? PMDformer’s Simple Subtraction Fix for Long-Term Forecasting

The paper identifies scale differences between patches as a hidden source of attention distortion in long‑term time‑series forecasting, introduces PMDformer with Patch Mean Decoupling, Neighbor Variable Attention, and Trend Recovery Attention, and demonstrates state‑of‑the‑art accuracy and efficiency across eight benchmark datasets.
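The "simple subtraction" idea can be illustrated with a toy sketch (assumptions: the paper's Patch Mean Decoupling has its own exact formulation; the function name `patch_mean_decouple`, the patch length, and the series below are illustrative only). Subtracting each patch's mean before attention removes the scale offsets that would otherwise dominate dot-product scores, while the stored means preserve the trend for later recovery:

```python
import numpy as np

def patch_mean_decouple(series, patch_len):
    """Split a 1-D series into patches and remove each patch's mean.

    Returns the zero-mean patches (fed to attention in this sketch)
    and the per-patch means (the trend component, to be added back).
    """
    n_patches = len(series) // patch_len
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)
    means = patches.mean(axis=1, keepdims=True)  # per-patch scale/offset
    return patches - means, means

series = np.arange(12, dtype=float)              # toy series with a strong trend
patches, means = patch_mean_decouple(series, patch_len=4)
print(patches[0])      # → [-1.5 -0.5  0.5  1.5], each patch now has zero mean
print(means.ravel())   # → [1.5 5.5 9.5], the removed means carry the trend
```

In this sketch, attention would operate on the zero-mean patches, so scale differences between patches cannot distort the attention weights; a trend-recovery step would reintroduce the stored means at the output.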

Attention Mechanism · ICLR 2026 · Long-term Time Series Forecasting
Bighead's Algorithm Notes
Oct 21, 2025 · Artificial Intelligence

KANMixer: A New KAN‑Centric Paradigm for Long‑Term Time Series Forecasting

This article reviews the KANMixer model, which places Kolmogorov‑Arnold Networks at the core of a lightweight architecture for long‑term time series forecasting, detailing its design, extensive benchmark experiments on seven real‑world datasets, ablation analyses, and its computational trade‑offs versus MLP and Transformer baselines.
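The KAN idea at the heart of KANMixer can be sketched in miniature (assumptions: KANMixer's actual layers use spline parameterizations and a specific mixer layout described in the paper; here each edge is simply a learnable combination of three fixed basis functions, and all names are illustrative). Unlike an MLP, where each edge carries a scalar weight, a KAN-style edge carries its own learnable univariate function:

```python
import numpy as np

def kan_edge(x, coeffs):
    """Learnable univariate function on one edge: sum_k coeffs[k] * basis_k(x)."""
    basis = np.stack([x, np.sin(x), np.tanh(x)])  # toy fixed basis
    return coeffs @ basis

def kan_layer(x, coeff_tensor):
    """Each (output, input) edge applies its own function; outputs sum the edges."""
    out_dim, in_dim, _ = coeff_tensor.shape
    y = np.zeros(out_dim)
    for o in range(out_dim):
        y[o] = sum(kan_edge(x[i], coeff_tensor[o, i]) for i in range(in_dim))
    return y

rng = np.random.default_rng(0)
x = rng.normal(size=4)               # 4 input features
coeffs = rng.normal(size=(2, 4, 3))  # 2 outputs x 4 inputs x 3 basis coefficients
print(kan_layer(x, coeffs).shape)    # → (2,)
```

The trade-off the article discusses follows directly from this structure: each edge stores a vector of basis coefficients rather than one weight, which buys expressive per-edge nonlinearities at extra parameter and compute cost relative to an MLP edge.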

Ablation Study · KAN · Long-term Time Series Forecasting