AI Agent Research Hub
Apr 1, 2026 · Artificial Intelligence

Scale‑PINN Solves High‑Re Navier‑Stokes in 100 Seconds, Cutting Error by 96 %

The tutorial introduces Scale‑PINN, which adds an evolutionary regularization term inspired by pseudo‑time stepping to the PINN loss, enabling a shared‑backbone network to solve the lid‑driven cavity Navier‑Stokes problem at Re = 7500 in about 100 seconds and reducing the relative velocity error by roughly 96 % compared with a standard PINN.
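The evolutionary regularization idea can be sketched in a few lines of JAX. This is a minimal illustration, not the tutorial's code: the network, the toy Laplacian residual (standing in for the Navier‑Stokes operator), and the names `dtau` and `lam` are all assumptions. The extra term penalizes deviation from one explicit pseudo‑time step taken from the previous prediction snapshot:

```python
import jax
import jax.numpy as jnp

# Illustrative single-layer "network"; u is the first output component.
def predict(params, xy):
    w, b = params
    return jnp.tanh(xy @ w + b)

def pde_residual(params, xy):
    # Toy steady residual: a Laplacian stands in for the Navier-Stokes operator.
    u = lambda p: predict(params, p)[0]
    h = jax.hessian(u)(xy)
    return h[0, 0] + h[1, 1]

def scale_pinn_loss(params, pts, u_prev, dtau=0.1, lam=1.0):
    res = jax.vmap(lambda p: pde_residual(params, p))(pts)
    u_now = jax.vmap(lambda p: predict(params, p)[0])(pts)
    # Evolutionary regularization: pull the current prediction toward one
    # explicit pseudo-time step from the previous snapshot u_prev.
    evo = u_now - (u_prev - dtau * res)
    return jnp.mean(res**2) + lam * jnp.mean(evo**2)
```

As pseudo‑time advances, `u_prev` is refreshed from the current network, so the regularizer mimics marching the flow toward its steady state rather than fitting it in one shot.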

Evolutionary regularization · High Reynolds number · JAX
0 likes · 25 min read
AI Agent Research Hub
Mar 18, 2026 · Artificial Intelligence

Variable-Scaling PINN for 2D Navier‑Stokes: How Coordinate Rescaling Improves Stiff PDE Training

This tutorial explains how a simple coordinate scaling (VS‑PINN) reduces stiffness in physics‑informed neural networks, demonstrates its implementation in JAX for the 2D steady incompressible Navier‑Stokes cylinder‑flow benchmark, and shows that after 80 000 Adam iterations the relative errors drop to 2.10 % (u), 5.06 % (v) and 4.45 % (p).
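The core of the coordinate‑rescaling trick is small enough to sketch. The scaling constants `MU` and `SIGMA` below are assumed placeholders (a real setup would derive them from the domain extents); the point is that the network only ever sees O(1) inputs, and differentiating through the rescaling reintroduces the 1/σ chain‑rule factors into every physical derivative automatically:

```python
import jax
import jax.numpy as jnp

# Assumed scaling constants for illustration: per-axis centre and half-width.
MU = jnp.array([2.0, 0.0])
SIGMA = jnp.array([4.0, 2.0])

def net(params, xy_hat):
    w, b = params
    return jnp.tanh(xy_hat @ w + b)[0]

def u_physical(params, xy):
    # The network only ever sees rescaled, O(1) coordinates.
    return net(params, (xy - MU) / SIGMA)

def grad_physical(params, xy):
    # jax.grad through the rescaling applies the 1/sigma chain-rule
    # factors, so PDE residuals can be written in physical coordinates.
    return jax.grad(u_physical, argnums=1)(params, xy)
```

Because the rescaling is diagonal, `grad_physical` equals the gradient in scaled coordinates divided elementwise by `SIGMA`, which is exactly the rebalancing that tames the stiff loss landscape.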

JAX · Navier-Stokes · PINN
0 likes · 24 min read
AI Agent Research Hub
Mar 14, 2026 · Artificial Intelligence

Adaptive-Weight NTK-PINN Solves High-Frequency Wave Equation Using JAX

This tutorial explains how the Neural Tangent Kernel (NTK) perspective reveals the loss‑balance problem in Physics‑Informed Neural Networks (PINNs), introduces an NTK‑based adaptive‑weight algorithm, provides a full JAX implementation for a 1‑D high‑frequency wave equation, and shows that input normalisation dramatically improves accuracy while only modestly increasing training time.
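A minimal sketch of the NTK‑based weighting follows, under stated assumptions: the tiny 1‑D model, the Helmholtz‑style residual, and the frequency `4π` are illustrative, not the tutorial's setup. The per‑sample sum of squared parameter gradients is the diagonal of the NTK, and each loss term gets the weight total‑trace / term‑trace, so slow‑converging terms are boosted:

```python
import jax
import jax.numpy as jnp

def net(params, x):
    w1, w2 = params
    return w2 * jnp.tanh(w1 * x)  # tiny 1-D model, scalar in/out

def residual(params, x):
    # Wave-like operator u'' + omega^2 u; omega = 4*pi chosen for illustration.
    u = lambda t: net(params, t)
    return jax.grad(jax.grad(u))(x) + (4.0 * jnp.pi) ** 2 * u(x)

def ntk_trace(params, xs, fn):
    # Sum over samples of the squared parameter-gradient norm:
    # the trace of that loss term's NTK block.
    def single(x):
        g = jax.grad(lambda p: fn(p, x))(params)
        return sum(jnp.sum(l ** 2) for l in jax.tree_util.tree_leaves(g))
    return jnp.sum(jax.vmap(single)(xs))

def adaptive_weights(params, x_int, x_bdy):
    tr_r = ntk_trace(params, x_int, residual)
    tr_b = ntk_trace(params, x_bdy, net)
    total = tr_r + tr_b
    # Weight for the residual and boundary losses, recomputed periodically.
    return total / tr_r, total / tr_b
```

In practice the weights are recomputed every few hundred iterations rather than every step, since the traces drift slowly during training.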

JAX · NTK · PINN
0 likes · 27 min read
AIWalker
Jul 1, 2025 · Artificial Intelligence

How a Minor Tweak to PINN Achieves Up to 100× Speedup

Recent breakthroughs such as VS‑PINN, Stiff‑PINN, MAD‑Scientist and KAN‑ODEs demonstrate how small algorithmic changes and novel training strategies can accelerate physics‑informed neural networks by orders of magnitude while expanding their applicability to stiff PDEs, chemical kinetics and dynamical systems.

PINN · Physics-Informed Neural Networks · Scientific Machine Learning
0 likes · 6 min read