HyperAI

Machine Translation on WMT2016 English 1

Metrics

BLEU score
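Models on this benchmark are ranked by corpus-level BLEU (0-100), which combines clipped n-gram precisions with a brevity penalty. As a rough illustration of how the scores in the table below are computed, here is a minimal, dependency-free sketch; actual leaderboard numbers typically come from tools such as sacrebleu, which additionally standardize tokenization and support multiple references.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU = brevity penalty * exp(mean of log n-gram precisions)."""
    clipped = [0] * max_n   # clipped n-gram matches per order
    total = [0] * max_n     # total hypothesis n-grams per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_counts, r_counts = ngrams(h, n), ngrams(r, n)
            # Clip each n-gram count by its count in the reference.
            clipped[n - 1] += sum(min(c, r_counts[g]) for g, c in h_counts.items())
            total[n - 1] += max(len(h) - n + 1, 0)
    if min(clipped) == 0:
        return 0.0  # an unsmoothed precision of zero collapses the geometric mean
    log_p = sum(math.log(c / t) for c, t in zip(clipped, total)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_p)
```

For example, `corpus_bleu(["the cat sat on the mat"], ["the cat sat on the mat"])` returns 100.0, the maximum score.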

Results

Performance results of various models on this benchmark

| Model Name | BLEU score | Paper Title | Repository |
| --- | --- | --- | --- |
| PBSMT + NMT | 25.13 | Phrase-Based & Neural Unsupervised Machine Translation | - |
| Denoising autoencoders (non-autoregressive) | 29.66 | Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement | - |
| ConvS2S BPE40k | 29.9 | Convolutional Sequence to Sequence Learning | - |
| CMLM+LAT+4 iterations | 32.87 | Incorporating a Local Translation Mechanism into Non-autoregressive Translation | - |
| NAT + FT + NPD | 29.79 | Non-Autoregressive Neural Machine Translation | - |
| DeLighT | 34.7 | DeLighT: Deep and Light-weight Transformer | - |
| GRU BPE90k | 28.9 | - | - |
| Unsupervised PBSMT | 21.33 | Phrase-Based & Neural Unsupervised Machine Translation | - |
| CMLM+LAT+1 iteration | 30.74 | Incorporating a Local Translation Mechanism into Non-autoregressive Translation | - |
| Deep Convolutional Encoder; single-layer decoder | 27.8 | A Convolutional Encoder Model for Neural Machine Translation | - |
| FlowSeq-large (NPD n=15) | 31.97 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | - |
| FLAN 137B (few-shot, k=9) | 20.5 | Finetuned Language Models Are Zero-Shot Learners | - |
| BiLSTM | 27.5 | A Convolutional Encoder Model for Neural Machine Translation | - |
| BART (TextBox 2.0) | - | TextBox 2.0: A Text Generation Library with Pre-trained Language Models | - |
| BiGRU | 28.1 | Edinburgh Neural Machine Translation Systems for WMT 16 | - |
| FlowSeq-base | 29.26 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | - |
| FlowSeq-large | 29.86 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | - |
| FLAN 137B (zero-shot) | 18.9 | Finetuned Language Models Are Zero-Shot Learners | - |
| Unsupervised NMT + Transformer | 21.18 | Phrase-Based & Neural Unsupervised Machine Translation | - |
| FlowSeq-large (NPD n=30) | 32.35 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | - |