Machine Heart
Mar 31, 2026 · Artificial Intelligence

ProMoE: Explicit Routing Breaks the Scaling Bottleneck of Diffusion‑Transformer MoE (ICLR 2026)

ProMoE introduces a two‑step routing MoE framework with explicit semantic guidance that tackles the high spatial redundancy and functional heterogeneity of visual tokens, enabling diffusion transformers to scale efficiently and outperform dense models and prior MoE approaches across generation, convergence, and scaling benchmarks.

Diffusion Transformer · Explicit Routing · Mixture of Experts
AIWalker
Mar 6, 2026 · Artificial Intelligence

VA‑π: Pixel‑Level Alignment Achieves 50% FID Reduction with 25‑Minute Fine‑Tuning

The paper introduces VA‑π, a lightweight post‑training framework that aligns pixel‑level reconstruction with autoregressive generation via variational inference and reinforcement learning, achieving up to a 50% FID reduction after just 25 minutes of fine‑tuning on LlamaGen‑XXL.

AR Models · Pixel Alignment · Variational Inference