Lead–LagNet: Modeling Cross‑Series Lead‑Lag Dependencies for Time‑Series Forecasting

Lead–LagNet addresses three key limitations of existing graph neural networks for multivariate time-series forecasting: loss of fine-grained temporal detail, shared-weight assumptions, and reduced interpretability. It does so with a sequence preprocessor (a global influence separator plus a subsequence detector), a subsequence dependency encoder, and a decoupled message-passing mechanism, and it achieves superior performance on synthetic benchmarks and S&P 500 market data.


Background

Real‑world dynamic systems often exhibit lead‑lag effects across related time series, such as stocks whose price movements lead and influence others. Traditional GNN‑based approaches compress temporal information into discrete points and assume synchronized, uniform influence, which fails to capture these asynchronous, heterogeneous interactions. Moreover, stacking GNN layers for multi‑hop dependencies reduces interpretability.

Problem Definition

The goal is to model both horizontal (within-series) temporal dynamics and vertical (across-series) lead-lag relationships in multivariate time-series forecasting. Formally, for a system with N entities, the recent window X_(t-T:t] = {x_1, …, x_N} collects one length-T sequence of D-dimensional features per entity, and the task is to predict the future target y_i = f_y(X_(t-T:t]; Θ) for each entity i.
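As a concrete reading of this setup, the sketch below fixes illustrative tensor shapes; the values of N, T, D and the placeholder forecaster are assumptions, not the paper's code.

```python
import torch

N, T, D = 500, 20, 4          # entities, window length, feature dimension (illustrative)
X = torch.randn(N, T, D)      # recent window X_(t-T:t], one row per entity

def f_y(X: torch.Tensor) -> torch.Tensor:
    # placeholder forecaster: mean-pool over time and features
    return X.mean(dim=(1, 2))

y_hat = f_y(X)                # shape (N,): one future-trend prediction per entity
```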

Method

The proposed Lead–LagNet framework consists of four main components.

3.1 Sequence Preprocessor

Global Influential Separator (GIS): A bidirectional LSTM encodes each entity's raw window into a representation v_i^{origin}, alongside a global influence component v_i^{GI}; subtracting the latter isolates entity-specific dynamics.
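A minimal PyTorch sketch of the separation idea follows. How the paper actually derives v_i^{GI} is not stated here, so a cross-entity mean of the encodings is assumed as the global component.

```python
import torch
import torch.nn as nn

class GlobalInfluentialSeparator(nn.Module):
    """Sketch of the GIS idea: encode each entity's window with a BiLSTM,
    estimate a shared global-influence component, and subtract it.
    The cross-entity mean used for v^GI is an assumption."""

    def __init__(self, d_in: int, d_hid: int):
        super().__init__()
        self.encoder = nn.LSTM(d_in, d_hid, batch_first=True, bidirectional=True)

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # X: (N, T, D) -> per-entity encodings v_origin: (N, 2*d_hid)
        out, _ = self.encoder(X)
        v_origin = out[:, -1, :]                    # last-step summary per entity
        v_gi = v_origin.mean(dim=0, keepdim=True)   # assumed global component
        return v_origin - v_gi                      # entity-specific dynamics

sep = GlobalInfluentialSeparator(d_in=4, d_hid=32)
v = sep(torch.randn(500, 20, 4))                    # (500, 64)
```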

Subsequence Detector: A boundary LSTM (boLSTM) introduces a boundary controller b_t^l that hierarchically segments the raw series into meaningful subsequences, producing time embeddings for each detected segment.
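The sketch below illustrates only the boundary-indicator mechanism at a single level; the paper's boLSTM is hierarchical, and the straight-through binarization is an assumed implementation detail.

```python
import torch
import torch.nn as nn

class BoundaryDetector(nn.Module):
    """Sketch of a boundary controller b_t: an LSTM scans the series and a
    sigmoid head flags segment boundaries, binarized with a straight-through
    estimator so gradients still flow through the soft probabilities."""

    def __init__(self, d_in: int, d_hid: int):
        super().__init__()
        self.rnn = nn.LSTM(d_in, d_hid, batch_first=True)
        self.head = nn.Linear(d_hid, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(x)                      # (N, T, d_hid)
        p = torch.sigmoid(self.head(h))         # soft boundary probability
        b = (p > 0.5).float()                   # hard 0/1 indicator
        return b + (p - p.detach())             # straight-through estimator

det = BoundaryDetector(d_in=4, d_hid=32)
b = det(torch.randn(500, 20, 4))                # (500, 20, 1), ~0/1 per step
```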

3.2 Subsequence Dependency Encoder (SDE)

Boundary indicators are element‑wise multiplied with corresponding subsequence embeddings and fed into a Transformer encoder (augmented with rotary position embeddings) to obtain enriched sequence embeddings that capture temporal offsets between subsequences.
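A minimal sketch of this masking-plus-attention step is below. Plain PyTorch attention does not expose rotary position embeddings, so a learned positional embedding is substituted here as a stand-in; layer counts and head counts are assumptions.

```python
import torch
import torch.nn as nn

class SubseqDependencyEncoder(nn.Module):
    """Sketch of the SDE: boundary indicators mask the subsequence embeddings
    element-wise, and a Transformer encoder relates the surviving segment
    embeddings. NOTE: the paper uses rotary position embeddings; a learned
    positional embedding stands in for them in this sketch."""

    def __init__(self, d_model: int, t_max: int, n_heads: int = 4):
        super().__init__()
        self.pos = nn.Embedding(t_max, d_model)   # stand-in for RoPE
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, h: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # h: (N, T, d_model) subsequence embeddings, b: (N, T, 1) boundaries
        masked = h * b                            # keep boundary-step embeddings
        idx = torch.arange(h.size(1), device=h.device)
        return self.encoder(masked + self.pos(idx))

sde = SubseqDependencyEncoder(d_model=64, t_max=20)
z = sde(torch.randn(500, 20, 64), torch.ones(500, 20, 1))   # (500, 20, 64)
```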

3.3 Lead–LagNet Core

Messaging Gate Generation: Non-linear transformations of a pair of entity embeddings produce a gating value that modulates the influence from entity j to entity i (see the combined sketch after the next item).

Message-Passing Process Decoupling: The gated influences are summed and processed by a bidirectional LSTM, eliminating the need for predefined relational graphs.
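The sketch below combines both steps: a learned pairwise gate replaces a predefined graph, and a BiLSTM digests the aggregated messages over time. The exact gate parameterization and how entity pairs are summarized are assumptions.

```python
import torch
import torch.nn as nn

class GatedMessagePassing(nn.Module):
    """Sketch of the Lead-LagNet core: a learned gate g_ij modulates how much
    entity j's state flows to entity i (no predefined graph), and a BiLSTM
    processes the summed, gated influences over time."""

    def __init__(self, d: int):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(2 * d, d), nn.Tanh(), nn.Linear(d, 1))
        self.rnn = nn.LSTM(d, d, batch_first=True, bidirectional=True)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (N, T, d); summarize each entity to score pairwise gates
        s = z.mean(dim=1)                                       # (N, d)
        N = s.size(0)
        pairs = torch.cat([s.unsqueeze(1).expand(N, N, -1),
                           s.unsqueeze(0).expand(N, N, -1)], dim=-1)
        g = torch.sigmoid(self.gate(pairs)).squeeze(-1)         # (N, N) gates
        msg = torch.einsum("ij,jtd->itd", g, z)                 # summed influence
        out, _ = self.rnn(msg)                                  # decoupled pass
        return out                                              # (N, T, 2d)

core = GatedMessagePassing(d=64)
h = core(torch.randn(100, 20, 64))                              # (100, 20, 128)
```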

3.4 Output Mapping Module

A single‑layer feed‑forward network maps the final aggregated representation to the future trend prediction, trained with mean‑squared‑error loss.
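A minimal sketch of this head, with illustrative dimensions assumed:

```python
import torch
import torch.nn as nn

# Single linear layer mapping the aggregated representation to a scalar
# trend prediction, trained with mean-squared-error loss.
head = nn.Linear(128, 1)
rep = torch.randn(100, 128)            # final aggregated representation
y = torch.randn(100)                   # future-trend targets
loss = nn.functional.mse_loss(head(rep).squeeze(-1), y)
loss.backward()                        # standard gradient step follows
```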

Experiments

4.1 Synthetic Tasks

A synthetic dataset is generated by sampling 100 numbers from U(-1, 1) and embedding six predefined subsequences. Three tasks evaluate subsequence counting, interval detection, and ordering recognition, on which Lead–LagNet's sequential module improves accuracy by 4.26%, 7.59%, and 8.89%, respectively, over four baseline RNNs (BiLSTM, BiGRU, HM-LSTM, D-LSTM). Adding the GIS further restores performance under global interference.
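A sketch of how such data could be generated is below. The six actual patterns and the embedding protocol are not given in the source, so short ramps, plateaus, and arcs are assumed here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Six predefined subsequences (assumed shapes; the paper's patterns differ).
patterns = [np.linspace(-1, 1, 5), np.linspace(1, -1, 5),
            np.full(5, 0.8), np.full(5, -0.8),
            np.sin(np.linspace(0, np.pi, 5)), -np.sin(np.linspace(0, np.pi, 5))]

def make_series(n_embed: int = 3, length: int = 100):
    """Sample a length-100 series from U(-1, 1) and overwrite random,
    non-overlapping 5-step blocks with predefined subsequences."""
    x = rng.uniform(-1, 1, length)
    labels = []
    starts = rng.choice(length // 5, size=n_embed, replace=False) * 5
    for s in starts:
        k = int(rng.integers(len(patterns)))
        x[s:s + 5] = patterns[k]
        labels.append(k)
    return x, labels

series, labels = make_series()   # tasks: count / locate / order these segments
```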

4.2 Stock‑Market Empirical Study

Using S&P 500 data from bull, bear, and sideways markets, the model is evaluated with MSE, Mean Reciprocal Rank (MRR), and Annualized Excess Return (AER). Lead–LagNet consistently outperforms baselines (BiLSTM, BiGRU, DA‑RNN, SFM, TGC, FinGAT, ADGAT, SGRN) across all metrics. Replacing Lead–LagNet’s sequential module with a standard BiLSTM degrades performance, confirming the importance of learned subsequence dependencies.
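For reference, one plausible reading of the MRR metric in a stock-ranking setting is sketched below; whether the paper scores a single top stock or several per day is an assumption.

```python
import numpy as np

def mean_reciprocal_rank(pred: np.ndarray, actual: np.ndarray) -> float:
    """MRR for one trading day: rank stocks by predicted return and take the
    reciprocal rank of the stock with the highest actual return."""
    order = np.argsort(-pred)                # predicted ranking, best first
    best = np.argmax(actual)                 # truly best-performing stock
    rank = int(np.where(order == best)[0][0]) + 1
    return 1.0 / rank

mrr = mean_reciprocal_rank(np.array([0.02, 0.05, -0.01]),
                           np.array([0.03, 0.01, 0.04]))   # -> 1/3
```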

Visualization Analysis

Message‑passing intensity visualizations reveal distinct lead‑lag patterns: in sideways markets the distribution is right‑skewed, indicating stronger risk propagation and more pronounced reactions to individual stock signals. Temporal aggregation shows persistent multi‑factor influences on stock prices, underscoring the value of capturing lead‑lag effects for market dynamics understanding.

Tags: time-series forecasting, sequence modeling, financial market prediction, lead-lag dependency, Lead–LagNet
Written by Bighead's Algorithm Notes, focused on AI applications in the fintech sector.