LangFlow Shows Continuous Diffusion Can Match Discrete Models Given Better Training
LangFlow revisits continuous diffusion for language modeling, arguing that earlier performance gaps stemmed from suboptimal training and evaluation rather than any inherent limitation. By combining embedding‑space diffusion with a log‑SNR noise schedule and a Gumbel‑based information schedule, it matches or exceeds discrete diffusion and autoregressive baselines on standard and zero‑shot benchmarks.
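To give a sense of what a log‑SNR noise schedule is, here is a minimal sketch of the standard parameterization used in variational diffusion models: the log signal‑to‑noise ratio lambda(t) is interpolated linearly over time, and the signal/noise coefficients satisfy alpha(t)^2 = sigmoid(lambda) and sigma(t)^2 = sigmoid(-lambda). The endpoints `lmbda_min` and `lmbda_max` are illustrative defaults, not values from the LangFlow paper, and this is a generic construction rather than LangFlow's actual schedule.

```python
import math

def linear_log_snr(t: float, lmbda_min: float = -10.0, lmbda_max: float = 10.0) -> float:
    """Log-SNR at time t in [0, 1], interpolated linearly.

    t = 0 gives the cleanest signal (lmbda_max), t = 1 the noisiest
    (lmbda_min). The endpoint values here are hypothetical defaults.
    """
    return lmbda_max + t * (lmbda_min - lmbda_max)

def alpha_sigma(lmbda: float) -> tuple[float, float]:
    """Signal and noise scales from the log-SNR.

    alpha^2 = sigmoid(lambda) and sigma^2 = sigmoid(-lambda),
    so alpha^2 + sigma^2 = 1 (variance-preserving diffusion).
    """
    alpha_sq = 1.0 / (1.0 + math.exp(-lmbda))
    return math.sqrt(alpha_sq), math.sqrt(1.0 - alpha_sq)

# A noisy embedding at time t would then be alpha * x + sigma * noise.
```

At lambda = 0 the signal and noise contribute equally (alpha = sigma = sqrt(0.5)); large positive lambda keeps the embedding nearly clean, large negative lambda replaces it with nearly pure noise.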
