How Information Shapes Koopman Representations for World Modeling

The paper revisits Koopman representation learning through the lens of the information bottleneck, identifies the exact information needed for stable, long‑term dynamics propagation, proposes an information‑shaped objective combining mutual information and von Neumann entropy, and demonstrates superior prediction accuracy and stability across physical, visual, and graph‑based dynamical systems.

Machine Heart

The authors observe that most world‑model approaches assume that learning a good latent dynamics model is sufficient. This assumption is questionable: the latent must retain the right information to support predictable, propagatable dynamics.

Motivation

They ask: What information is necessary in a finite‑dimensional representation to enable stable dynamics propagation and long‑term prediction? To answer, they reinterpret Koopman representation learning using the information‑bottleneck (IB) framework.
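For reference, the classical IB Lagrangian that this reinterpretation builds on (standard notation, not the paper's: $X$ is the input, $Y$ the prediction target, $Z$ the latent):

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta\, I(Z; Y)
```

The first term compresses the representation; the second, weighted by $\beta$, preserves predictive information. The paper's question is which part of $I(Z;Y)$ a finite‑dimensional Koopman latent must keep.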

Key Properties of a Good Koopman Representation

Temporal Coherence: the latent must evolve stably over time.

Structural Consistency: the latent evolution should respect the linear structure imposed by the Koopman operator.

Predictive Sufficiency: the representation must retain enough dynamical modes to support accurate long‑term forecasts.
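These three properties presuppose a particular model shape: encode the state into a latent, propagate the latent with a fixed linear operator, decode at the end. A minimal sketch of that shape, with all names, dimensions, and the linear stand‑ins for learned encoder/decoder networks chosen here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-dimensional Koopman world model:
# encode state x -> latent z, propagate linearly with K, decode back.
d_state, d_latent = 3, 8
W_enc = rng.standard_normal((d_latent, d_state))   # stand-in for encoder phi
W_dec = rng.standard_normal((d_state, d_latent))   # stand-in for decoder psi
# A contractive operator (spectral radius 0.95) keeps rollouts stable.
K = 0.95 * np.linalg.qr(rng.standard_normal((d_latent, d_latent)))[0]

def rollout(x0, horizon):
    """Propagate entirely in latent space: z_{t+1} = K z_t, decode at the end."""
    z = W_enc @ x0
    for _ in range(horizon):
        z = K @ z            # structural consistency: purely linear evolution
    return W_dec @ z

x_pred = rollout(rng.standard_normal(d_state), horizon=50)
print(x_pred.shape)  # (3,)
```

Whether such a rollout stays accurate over 50 steps is exactly what the three properties above govern.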

Theoretical Analysis

1. Long‑term prediction error stems from accumulated information loss. Each Koopman step discards a fraction of predictive information; the sum of these losses yields the final error. Consequently, mutual information (MI) directly measures how much predictive capability the representation preserves.
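A schematic version of this accounting (notation chosen here for illustration, not taken from the paper): write $I_t$ for the predictive information the latent carries after $t$ Koopman steps. Each step's loss is non‑negative, and the losses telescope:

```latex
\delta_t \;=\; I_{t-1} - I_t \;\ge\; 0,
\qquad
I_0 - I_T \;=\; \sum_{t=1}^{T} \delta_t .
```

So long‑horizon degradation is exactly the accumulated per‑step information loss, which is why MI is the natural quantity to track.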

2. Not all information is equally important. MI alone cannot distinguish which type of information is lost. The authors decompose the decoder‑relevant information into three parts (Fig. 2b):

Temporal‑coherent information – stable across time and aligned with Koopman modes.

Fast‑dissipating information – useful only for a short horizon.

Residual information – akin to noise or instantaneous injections.

The first class is crucial for long‑term prediction.

3. Maximising MI alone leads to mode collapse. Excessive MI drives information to concentrate on a few dominant modes, reducing the effective dimensionality of the latent space and causing spectral degeneration.
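Von Neumann entropy of the latent covariance makes this collapse measurable: it is low when the eigenvalue spectrum concentrates on a few modes and high when many modes stay active. A hedged sketch (the density‑matrix‑style normalization and the synthetic data are illustrative, not the paper's construction):

```python
import numpy as np

def von_neumann_entropy(Z):
    """VNE of the normalized latent covariance: -sum p_i log p_i over its
    eigenvalues. Low VNE = information concentrated in few modes (collapse)."""
    C = Z.T @ Z / len(Z)
    eig = np.clip(np.linalg.eigvalsh(C), 1e-12, None)
    p = eig / eig.sum()                  # normalize spectrum to a distribution
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
diverse = rng.standard_normal((1000, 8))                      # all 8 modes active
collapsed = rng.standard_normal((1000, 1)) @ np.ones((1, 8))  # one dominant mode
print(von_neumann_entropy(diverse) > von_neumann_entropy(collapsed))  # True
```

The diverse batch scores near the maximum log 8 ≈ 2.08 nats, while the rank‑one batch scores near zero, which is the spectral degeneration the VNE penalty is meant to prevent.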

Information‑Shaped Objective

To balance the trade‑off, the paper introduces a Lagrangian that jointly optimises:

MI term – encourages preservation of temporal‑coherent information (Temporal Coherence).

Structural‑consistency term – enforces linear forward dynamics (Structural Consistency).

von Neumann entropy (VNE) term – penalises over‑concentration of information, preserving a diverse set of predictive modes (Predictive Sufficiency).

Reconstruction/ELBO term – stabilises training.

The resulting optimisation objective directly maps to the three Koopman properties (Fig. 3).
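The four terms above can be sketched as one composite loss. The term names mirror the paper's split, but every implementation detail here is a stand‑in: an InfoNCE surrogate for the MI term, a latent residual for structural consistency, negative VNE as the diversity penalty, a squared decoding error for reconstruction, and arbitrary weights `lam`:

```python
import numpy as np

rng = np.random.default_rng(0)

def vne(Z):
    """Von Neumann entropy of the normalized latent covariance spectrum."""
    eig = np.clip(np.linalg.eigvalsh(Z.T @ Z / len(Z)), 1e-12, None)
    p = eig / eig.sum()
    return -(p * np.log(p)).sum()

def infonce(a, b, temp=0.1):
    """Batch InfoNCE lower bound on I(a; b): matching rows are positives."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    logits = a @ b.T / temp
    logits -= logits.max(axis=1, keepdims=True)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(logp))      # cross-entropy of the positive pairs

def info_shaped_loss(z_t, z_next, x_next, K, W_dec, lam=(1.0, 1.0, 0.1, 1.0)):
    """Illustrative composite objective; weights and surrogates are not the paper's."""
    z_pred = z_t @ K.T
    mi = infonce(z_pred, z_next)                # preserve temporal-coherent info
    struct = np.mean((z_pred - z_next) ** 2)    # enforce linear forward dynamics
    diversity = -vne(z_next)                    # penalize mode concentration
    recon = np.mean((z_next @ W_dec.T - x_next) ** 2)  # stabilize training
    return lam[0]*mi + lam[1]*struct + lam[2]*diversity + lam[3]*recon

B, d, n = 64, 8, 3
z_t, z_next = rng.standard_normal((B, d)), rng.standard_normal((B, d))
K, W_dec = np.eye(d), rng.standard_normal((n, d))
x_next = rng.standard_normal((B, n))
print(np.isfinite(info_shaped_loss(z_t, z_next, x_next, K, W_dec)))  # True
```

In a real training loop the loss would be minimised over the encoder, decoder, and `K` jointly; here random tensors merely exercise the four terms.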

Experiments

The method is evaluated on three families of tasks:

Physical dynamical systems: Lorenz‑63, Kármán vortex, dam flow, ERA5 weather forecasting.

High‑dimensional visual control: planar, pendulum, cart‑pole.

Graph‑structured dynamics: rope and soft‑robotics simulations.

Across all benchmarks, the proposed approach achieves lower short‑ and long‑term prediction errors than multiple Koopman baselines (Fig. 4‑5). In the Kármán vortex task, long‑horizon latent rollouts stay close to the single‑step predictions, and the learned eigenvalues cluster near the unit circle, indicating a more stable spectral structure (Fig. 6).
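Eigenvalues near the unit circle are easy to check directly on a learned operator. A hedged sketch of that diagnostic (the orthogonal test matrix is a synthetic stand‑in for a trained Koopman operator):

```python
import numpy as np

def spectral_diagnostics(K):
    """Distance of each Koopman eigenvalue from the unit circle.
    |lambda| > 1 modes blow up over long rollouts; |lambda| << 1 modes dissipate."""
    eig = np.linalg.eigvals(K)
    return np.abs(np.abs(eig) - 1.0)

rng = np.random.default_rng(0)
# An orthogonal matrix has all eigenvalues exactly on the unit circle.
stable = np.linalg.qr(rng.standard_normal((8, 8)))[0]
print(spectral_diagnostics(stable).max() < 1e-8)  # True
```

Plotting the raw eigenvalues in the complex plane gives the kind of spectral picture the paper reports in Fig. 6.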

Conclusion and Outlook

By framing Koopman representation learning as an information‑theoretic optimisation problem, the work clarifies that the central challenge is not merely finding a linearising latent space but selecting the minimal set of information that guarantees both propagation stability and predictive power under finite‑dimensional constraints. The MI‑VNE formulation provides a principled way to avoid mode collapse while preserving essential dynamics, opening avenues for more robust world‑model construction.

Tags: representation learning, mutual information, Koopman, von Neumann entropy
Written by Machine Heart, a professional AI media and industry service platform.