Machine Translation
Machine translation is a core task in natural language processing: converting sentences from a source language into equivalent expressions in a target language. In recent years, neural models built on the encoder-decoder architecture with attention, most notably the Transformer, have made significant progress and greatly improved translation quality. Common evaluation metrics include BLEU, METEOR, and NIST, and the WMT series of datasets is a widely used benchmarking resource.
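BLEU, the most common of these metrics, combines clipped n-gram precision with a brevity penalty. The sketch below is a simplified single-reference, sentence-level version in plain Python, assuming whitespace tokenization; production implementations (e.g. sacreBLEU) add standardized tokenization, smoothing, and corpus-level aggregation.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU against a single reference."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    # Without smoothing, any zero precision drives the geometric mean to 0.
    if min(precisions) == 0:
        return 0.0
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(candidate) >= len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

For example, `bleu("the cat is on the mat".split(), "the cat is on the mat".split())` returns 1.0, while any mismatch lowers the score.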
Models appearing on the task's benchmark leaderboards (deduplicated):
tensorflow/tensor2tensor
Multilingual Transformer
HWTSC-Teacher-Sim
T5
Larger
Transformer-base
GenTranslate-7B
SeamlessM4T Large
GPT-4o (HPT)
PiNMT
BP-Transformer
PS-KD
EnViT5 + MTet
Seq-KD + Seq-Inter + Word-KD
Transformer base + BPE-Dropout
NLLB-200
Adaptively Sparse Transformer (alpha-entmax)
ChatGPT
slone/mbart-large-51-myv-mul-v1
PENELOPIE Transformers-based NMT (EN2EL)
M_C
DynamicConv
Multi-pass backtranslated adapted transformer
Transformer trained on highly filtered data
Vega-MT
Evolved Transformer Big
Transformer+BT (ADMIN init)
Transformer Cycle (Rev)
Bi-SimCut
C2-50k Segmentation
ByteNet
Attentional encoder-decoder + BPE
DeLighT
fast-noisy-channel-modeling
StrokeNet
OmniNetP
Facebook FAIR (ensemble)