Unsupervised KG-to-Text Generation on GenWiki
Metrics
BLEU
CIDEr
METEOR
ROUGE-L
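For reference, below is a minimal sketch of how these metrics are commonly computed for a generated sentence against a reference. The library choices (nltk, rouge_score) and the example strings are assumptions for illustration, not the benchmark's official evaluation pipeline; CIDEr is noted in a comment since it is a corpus-level metric.

```python
# Sketch: sentence-level BLEU, METEOR, and ROUGE-L for one hypothesis/reference pair.
# Library choices are assumptions, not the benchmark's official scoring code.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction
from nltk.translate.meteor_score import meteor_score  # needs nltk's wordnet data
from rouge_score import rouge_scorer

reference = "the castle was built in the 12th century".split()
candidate = "the castle dates from the 12th century".split()

# BLEU: modified n-gram precision with a brevity penalty (smoothed for short texts)
bleu = sentence_bleu([reference], candidate,
                     smoothing_function=SmoothingFunction().method1)

# METEOR: unigram matching with stemming and synonymy
meteor = meteor_score([reference], candidate)

# ROUGE-L: F-measure over the longest common subsequence
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)
rouge_l = scorer.score(" ".join(reference), " ".join(candidate))["rougeL"].fmeasure

print(f"BLEU={bleu:.3f}  METEOR={meteor:.3f}  ROUGE-L={rouge_l:.3f}")
# CIDEr (TF-IDF-weighted n-gram similarity) is defined over a corpus and is
# usually computed with pycocoevalcap's Cider scorer rather than per sentence.
```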
Results
Performance of various models on this benchmark:
| Model Name | BLEU | CIDEr | METEOR | ROUGE-L |
|---|---|---|---|---|
| CycleGT_Warm | 40.47 | 3.48 | 34.84 | 63.40 |
| CycleGT_Base | 41.29 | 3.53 | 35.39 | 63.73 |
| DirectTransfer | 13.89 | 1.26 | 25.76 | 39.75 |
| Rule-Based | 13.45 | 1.26 | 30.72 | 40.93 |
| NoisySupervised | 35.03 | 2.63 | 33.45 | 58.14 |