Tag

BLEU

8 articles collected under this tag.

NetEase LeiHuo Testing Center
Mar 31, 2023 · Artificial Intelligence

Comparative Evaluation of DeepL and ChatGPT Machine Translation for Game Localization

This article investigates the translation quality of DeepL and ChatGPT for the game 'Naraka: Bladepoint' by comparing their outputs against professional human translations across Chinese‑English, Chinese‑Spanish, and English‑Spanish pairs using BLEU scores and manual assessment, revealing the strengths and limitations of each system.

AIGC · BLEU · ChatGPT
0 likes · 12 min read
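Since every article under this tag leans on BLEU, a minimal sketch of the metric may help: modified n‑gram precision combined via a geometric mean and scaled by a brevity penalty. This pure‑Python version (single reference, crude smoothing) is illustrative only; the evaluations described in these articles would typically use a standard implementation such as sacreBLEU.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU against a single reference (illustrative)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum((cand & ref).values())      # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # crude zero smoothing
    # geometric mean of the n-gram precisions
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # brevity penalty punishes candidates shorter than the reference
    bp = 1.0 if len(candidate) >= len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * geo_mean
```

A perfect match scores 1.0; dropped words lower both the n‑gram precisions and the brevity penalty.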
Tencent Tech
Jul 22, 2021 · Artificial Intelligence

How Tencent Dominated WMT2021: Winning Five News‑Track Translation Tasks

Tencent’s machine‑translation teams clinched five first‑place wins in the WMT2021 news track—covering Chinese‑English, Japanese‑English and English‑German limited‑resource tasks—outperforming 82 competing teams and showcasing the impact of its AI‑driven translation engine across its products.

AI competition · BLEU · Tencent
0 likes · 4 min read
New Oriental Technology
Feb 1, 2021 · Artificial Intelligence

Neural Machine Translation: Seq2Seq, Beam Search, BLEU, Attention Mechanisms, and GNMT Improvements

This article explains key concepts of neural machine translation, covering Seq2Seq encoder‑decoder models, beam search strategies, BLEU evaluation, various attention mechanisms, and the enhancements introduced in Google's Neural Machine Translation system to improve speed, OOV handling, and translation quality.

Attention · BLEU · GNMT
0 likes · 11 min read
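The beam-search decoding strategy covered in the article above can be sketched generically: keep the k highest-scoring partial hypotheses at each step, expand each with its next-token distribution, and re-prune. The `step_logprobs` callable and `</s>` end token here are illustrative assumptions, not code from the article.

```python
import math

def beam_search(start, step_logprobs, beam_size, max_len):
    """Generic beam search. `step_logprobs(seq)` returns a dict mapping
    next tokens to log-probabilities; a hypothesis ends at '</s>'."""
    beams = [(0.0, [start])]
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == "</s>":          # completed: set aside
                finished.append((score, seq))
                continue
            for tok, lp in step_logprobs(seq).items():
                candidates.append((score + lp, seq + [tok]))
        if not candidates:                  # every beam has terminated
            break
        candidates.sort(key=lambda x: x[0], reverse=True)
        beams = candidates[:beam_size]      # prune to the k best
    finished.extend(b for b in beams if b[1][-1] == "</s>")
    # fall back to unfinished beams if nothing terminated in time
    best = max(finished or beams, key=lambda x: x[0])
    return best[1]
```

With a toy model that always offers "a" (p=0.6) or "</s>" (p=0.4), stopping immediately beats any longer hypothesis, since each extra token multiplies in another probability below 1.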
DataFunTalk
Jan 10, 2021 · Artificial Intelligence

Didi's Machine Translation System: Architecture, Techniques, and WMT2020 Competition Experience

This article presents a comprehensive overview of Didi's machine translation platform: its evolution from statistical to neural models, a Transformer architecture with relative position representations and an enlarged feed‑forward network, data preparation, training strategies such as back‑translation and knowledge distillation, deployment optimizations with TensorRT, and the team's successful participation in the WMT2020 news translation task.

BLEU · Neural Networks · TensorRT
0 likes · 14 min read
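Back‑translation, one of the training strategies the article above lists, can be sketched in a few lines: run target‑language monolingual text through a reverse (target→source) model to mint synthetic parallel pairs for training the forward model. The `reverse_model` callable here is a hypothetical stand‑in for a trained system, not Didi's actual API.

```python
def back_translate(mono_target, reverse_model):
    """Back-translation data augmentation.

    mono_target:   list of target-language sentences (monolingual data)
    reverse_model: callable sentence -> sentence, a target->source
                   translator (hypothetical stand-in here)

    Returns synthetic (source, target) pairs: the machine-generated
    source paired with the genuine, fluent target sentence.
    """
    return [(reverse_model(t), t) for t in mono_target]
```

The key point is that the *target* side of each synthetic pair is real human text, so the forward model learns to produce fluent output even from noisy synthetic input; iterating this process (retranslating with each improved model) is the "iterative back‑translation" the articles mention.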
Didi Tech
Oct 27, 2020 · Artificial Intelligence

Didi's Machine Translation System: Architecture, Techniques, and WMT2020 Competition Experience

Didi's machine translation system combines a Transformer‑big architecture with relative position representations, enlarged feed‑forward networks, iterative back‑translation, knowledge distillation, and domain fine‑tuning, optimized via TensorRT for speed, achieving a BLEU score of 36.6 and third place in the WMT2020 Chinese‑to‑English news task.

BLEU · Neural Networks · TensorRT
0 likes · 15 min read
Didi Tech
Aug 23, 2020 · Artificial Intelligence

DiDi AI Labs Achieves Third Place in WMT2020 News Translation Task

DiDi AI Labs’ NLP team earned third place in the WMT2020 Chinese‑to‑English news translation task with a 36.6 BLEU score, using an enhanced Transformer‑2 model that incorporates self‑attention, relative positional attention, iterative back‑translation, knowledge distillation, data cleaning, ensembling, and other techniques, now deployed across DiDi’s international services.

BLEU · DiDi AI Labs · NLP
0 likes · 5 min read
DataFunTalk
Apr 10, 2020 · Artificial Intelligence

Improving Machine Translation: Addressing Exposure Bias, Efficient Decoding, and Non‑Autoregressive Models

This article reviews recent research on machine translation that tackles the training‑inference distribution gap, exposure bias, and slow autoregressive decoding by introducing scheduled sampling, differentiable sequence‑level losses, cube‑pruning, and sequence‑aware non‑autoregressive decoding, showing BLEU gains and significant speedups.

BLEU · NLP · cube pruning
0 likes · 16 min read
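Scheduled sampling, one remedy for the exposure bias the article above discusses, can be sketched as a coin flip that decays from teacher forcing (feeding the gold token to the decoder) toward feeding the model's own prediction. The inverse‑sigmoid decay follows Bengio et al.'s scheduled‑sampling formulation; the decay constant `k` and helper names are illustrative choices, not the paper's code.

```python
import math
import random

def teacher_forcing_prob(step, k=1000.0):
    """Inverse-sigmoid decay: ~1.0 early in training, approaching 0 later,
    so the decoder gradually sees more of its own (possibly wrong)
    predictions and the train/inference gap narrows. k sets decay speed."""
    return k / (k + math.exp(step / k))

def next_input(gold_token, model_token, step, rng=random):
    """Pick the decoder's next input token under scheduled sampling."""
    p = teacher_forcing_prob(step)
    return gold_token if rng.random() < p else model_token
```

Early in training the decoder is almost always conditioned on gold history; tens of thousands of steps in, it is almost always conditioned on its own samples, matching the inference-time setting.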
58 Tech
Mar 13, 2019 · Artificial Intelligence

Design and Implementation of the 58.com Intelligent Article Writing Robot

The article describes the design, workflow, and two‑stage model improvements of 58.com's intelligent writing robot, which uses template matching, seq2seq with attention and beam search, and slot‑replacement techniques to automatically generate titles and body content for real‑estate and used‑car promotions, achieving high publishing volume and readership.

AI writing · BLEU · BeamSearch
0 likes · 9 min read