DataFunTalk
Apr 10, 2020 · Artificial Intelligence
Improving Machine Translation: Addressing Exposure Bias, Efficient Decoding, and Non‑Autoregressive Models
This article reviews recent machine translation research that addresses exposure bias (the mismatch between training and inference distributions) and slow autoregressive decoding. The techniques covered include scheduled sampling, differentiable sequence‑level losses, cube pruning for efficient decoding, and sequence‑aware non‑autoregressive models, with reported BLEU gains and substantial decoding speedups.
BLEU · NLP · cube pruning