Domestic World Model Claims Dual Crown, Surpassing Google and Nvidia via MoE Scaling

Manifold AI's WorldScape 0.2 topped the WorldArena benchmark, leading in visual quality, physics compliance, and 3D accuracy while using only about 10% of the parameters of competing models. The company credits a newly introduced Mixture-of-Experts (MoE) architecture, which it says establishes a new scaling law for world models.

Machine Heart

WorldArena, the first unified "function + vision" benchmark for embodied world models, is jointly organized by Tsinghua, Peking, Princeton, and other top institutions. It evaluates models on visual quality, motion quality, content consistency, physical compliance, 3D accuracy, and controllability, as well as on downstream tasks (Data Engine, Policy Evaluator, and Action Planner), and aggregates the results into the composite EWMScore.
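For illustration only, a composite score of this kind can be built as a weighted average of per-dimension scores. The dimension names below follow the article, but the scores and equal weights are invented; the actual EWMScore formula is not described here:

```python
# Hypothetical composite score over benchmark dimensions.
# Dimension names follow the article; scores and weights are invented
# for illustration and are NOT WorldArena's real EWMScore formula.

def composite_score(scores: dict, weights: dict) -> float:
    """Weighted average of per-dimension scores."""
    total_weight = sum(weights.values())
    return sum(scores[d] * weights[d] for d in weights) / total_weight

scores = {
    "visual_quality": 88.0,
    "motion_quality": 85.0,
    "content_consistency": 90.0,
    "physical_compliance": 92.0,
    "3d_accuracy": 89.0,
    "controllability": 87.0,
}
weights = {d: 1.0 for d in scores}  # equal weights, purely illustrative

print(round(composite_score(scores, weights), 2))  # 88.5
```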

Manifold AI's WorldScape 0.2 achieved the global #1 rank on WorldArena, showing balanced strength across all dimensions with no noticeable weakness, which in turn supports complex long-horizon embodied tasks.

Leading overall perception score: WorldScape 0.2 secured the top comprehensive perception score, with balanced performance across the visual, motion, consistency, and controllability metrics.

Physical compliance first: The model attained the highest score for adhering to real‑world physics, internalizing gravity, friction, collisions, and force feedback, enabling reliable physical simulation for robotics.

Outstanding 3D spatial understanding: It maintained high 3D accuracy in tasks such as robotic arm manipulation, viewpoint changes, and occlusion handling, avoiding common spatial distortion seen in video models.

One month earlier, WorldScape 0.1 had already topped the WorldScore benchmark (both static and dynamic tracks), confirming its consistent leadership.

The breakthrough is attributed to the introduction of a Mixture‑of‑Experts (MoE) architecture. MoE, proven in large language models, activates a small subset of specialized sub‑networks per input, allowing massive parameter scaling with modest compute cost.
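The sparse-activation idea can be sketched in a few lines. This is a generic top-k router in pure Python, not WorldScape's actual design; the expert count, expert size, and softmax-over-logits routing are standard MoE conventions used here as assumptions:

```python
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def top_k_route(router_logits, k):
    """Pick the k experts with the highest router scores and
    renormalize their gate weights so they sum to 1."""
    probs = softmax(router_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    return [(i, probs[i] / norm) for i in top]

random.seed(0)
num_experts, k, expert_params = 64, 2, 10_000_000  # illustrative sizes
logits = [random.gauss(0, 1) for _ in range(num_experts)]
routes = top_k_route(logits, k)

# Only k experts run per input: compute cost scales with k,
# while model capacity scales with num_experts.
print("active experts:", [i for i, _ in routes])
print("active vs. total parameters:", k * expert_params, "/", num_experts * expert_params)
```

With top-2 routing over 64 experts, roughly 3% of the expert parameters are active per input, which is the mechanism behind "massive parameter scaling with modest compute cost".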

Applying MoE to world models is crucial because it lets different experts handle visual dynamics, interaction, and physical reasoning separately, then fuse their outputs via a gating mechanism, preserving overall scalability while preventing interference between heterogeneous knowledge domains.
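The fusion step described above combines expert outputs weighted by the gate. In this toy sketch the three "experts" are hypothetical stand-ins for sub-networks handling visual dynamics, interaction, and physical reasoning; only the gating mechanism itself is the point:

```python
# Toy gated fusion of domain experts. The three expert functions are
# hypothetical stand-ins, not the model's actual modules.

def visual_expert(x):   return [v * 1.1 for v in x]
def interact_expert(x): return [v + 0.5 for v in x]
def physics_expert(x):  return [v - 0.2 for v in x]

EXPERTS = [visual_expert, interact_expert, physics_expert]

def gated_fusion(x, gate_weights):
    """Element-wise weighted sum of expert outputs; gate weights sum to 1."""
    assert abs(sum(gate_weights) - 1.0) < 1e-6
    outputs = [f(x) for f in EXPERTS]
    return [sum(w * out[i] for w, out in zip(gate_weights, outputs))
            for i in range(len(x))]

fused = gated_fusion([1.0, 2.0], [0.6, 0.3, 0.1])
print(fused)  # [1.19, 2.25]
```

Because each expert only ever sees the inputs the gate routes to it, heterogeneous knowledge (geometry vs. dynamics vs. contact physics) stays in separate parameter sets instead of interfering in one shared network.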

WorldScape 0.2’s MoE‑driven advances are manifested in three key aspects:

Multi‑expert collaborative generalization: The architecture expands from single‑task to a unified framework that jointly learns multiple control signals, enabling fine‑grained robotic manipulation and other embodied behaviors within a plug‑and‑play, extensible base.

Unified spatial representation: It aligns geometric constraints, semantic understanding, and physical laws in a shared implicit latent space, ensuring stable topology and coherent semantics during long‑range interactions.

Multi‑stage continual learning: A progressive training regime injects massive world knowledge and tightly couples heterogeneous control signals, shifting the model from merely “visually realistic” to “physically trustworthy”.

Despite using only about 10% of the parameter count of other top models, WorldScape 0.2 delivers the highest spatial intelligence density and real‑time inference, offering strong support for edge‑side physical AI deployment.

Manifold AI’s dominant performance across multiple mainstream benchmarks and its demonstrated scaling capability suggest that a GPT‑3‑like era for world models may soon arrive.

Tags: Mixture of Experts, embodied AI, Scaling Law, World Models, WorldArena, Manifold AI