HyperAI Super Neural
Feb 3, 2026 · Artificial Intelligence

Walrus: 1.3B Transformer Model Beats Prior Foundations Across 19 Physics Domains

Walrus, a 1.3‑billion‑parameter Transformer built by Polymathic AI, is pretrained on 19 diverse physics scenarios spanning astrophysics, geoscience, rheology, plasma physics, and acoustics. Using techniques such as patch jittering, adaptive‑compute tokenization, and space‑time factorized attention, it consistently outperforms earlier foundation models on both short‑ and long‑horizon continuum‑dynamics prediction.
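Space‑time factorized attention, one of the techniques named above, splits full attention over a space‑time token grid into a cheaper spatial pass followed by a temporal pass. A minimal numpy sketch of the idea (not Walrus's actual implementation; shapes and function names are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # scaled dot-product attention over the second-to-last axis (tokens)
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    return softmax(scores) @ v

def factorized_attention(x):
    # x: (time, space, dim).
    # Spatial pass: tokens attend within each time step.
    # Temporal pass: tokens attend across time at each spatial location.
    # Cost scales as T*S^2 + S*T^2 instead of (T*S)^2 for full attention.
    x = attend(x, x, x)            # attend over the space axis
    xt = np.swapaxes(x, 0, 1)      # reshape to (space, time, dim)
    xt = attend(xt, xt, xt)        # attend over the time axis
    return np.swapaxes(xt, 0, 1)   # back to (time, space, dim)
```

The factorization trades some expressivity (no single head sees all space‑time pairs at once) for a large reduction in compute, which is what makes long rollouts on dense physics grids tractable.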

Transformer · Walrus · continuum dynamics
13 min read
PaperAgent
Jan 17, 2026 · Artificial Intelligence

Hypergraphs Turn LLMs into Reliable Material Discovery Agents

This article explains how representing multi‑component scientific knowledge as hyperedges, rather than traditional triples, enables large language models to traverse complex material interactions, reduce hallucinations, and generate verifiable experimental designs, demonstrated through a large hypergraph built from thousands of scaffold papers.
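The core representational shift described above, from binary triples to hyperedges, can be sketched in a few lines. A triple store links only two entities per fact, so a multi‑component interaction (e.g., which combination of materials produced an effect) is smeared across several edges; a hyperedge binds all components of one finding together, with provenance attached for verification. All entity names and the schema below are hypothetical illustrations, not the article's actual data model:

```python
# Triple view: three separate facts; which combination caused the
# effect, and under what evidence, is lost between the edges.
triples = [
    ("scaffold_A", "contains", "collagen"),
    ("scaffold_A", "contains", "chitosan"),
    ("scaffold_A", "improves", "cell_adhesion"),
]

# Hyperedge view: one edge = one verifiable multi-component finding,
# carrying its full component set and a source for checking.
hyperedges = [
    {"components": {"collagen", "chitosan"},
     "outcome": "cell_adhesion_up",
     "source": "paper_123"},
]

def findings_with(material, edges):
    """Traverse the hypergraph: return the outcome of every
    multi-component finding that includes `material`."""
    return [e["outcome"] for e in edges if material in e["components"]]
```

Because each traversal step lands on a complete, sourced finding rather than a fragment, an LLM agent querying such a structure can cite the supporting paper for every claim it assembles, which is the mechanism the article credits for reducing hallucinations.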

AI reasoning · Hypergraph · LLM
7 min read