Abstractive Text Summarization on CNN / Daily Mail
Metrics
- ROUGE-1 (unigram overlap, F1)
- ROUGE-2 (bigram overlap, F1)
- ROUGE-L (longest-common-subsequence overlap, F1)
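
For reference, below is a minimal scoring sketch using Google's `rouge-score` package. The package choice is an assumption; many of the entries in the table were computed with the original ROUGE-1.5.5 Perl script, so exact values can differ slightly.

```python
# Minimal ROUGE sketch; assumes `pip install rouge-score` (Google's package).
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)

reference = "the cat sat on the mat and watched the birds outside ."
candidate = "a cat sat on the mat watching birds ."

# score(target, prediction) returns precision, recall, and F-measure per metric.
scores = scorer.score(reference, candidate)
for name, score in scores.items():
    # Leaderboard values below are F-measures scaled to 0-100.
    print(f"{name}: F1 = {100 * score.fmeasure:.2f}")
```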
Results
ROUGE scores of various models on this benchmark
Model Name | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper Title | Repository |
---|---|---|---|---|---|
end2end w/ inconsistency loss | 40.68 | 17.97 | 37.13 | - | - |
Subformer-base | 40.9 | 18.3 | 37.7 | - | - |
Two-Stage + RL | 41.71 | 19.49 | 38.79 | - | - |
PEGASUS | 44.17 | 21.47 | 41.11 | - | - |
Selector & Pointer-Generator | 41.72 | 18.74 | 38.79 | - | - |
DELTA (BLSTM) | - | - | 27.3 | - | - |
PTGEN + Coverage | 39.53 | 17.28 | 36.38 | - | - |
SRformer-BART | 43.19 | 19.80 | 40.40 | - | - |
CoCoNet + CoCoPretrain | 44.50 | 21.55 | 41.24 | - | - |
Scrambled code + broken | 46.71 | 20.39 | 43.56 | - | - |
ERNIE-GENLARGE (large-scale text corpora) | 44.31 | 21.35 | 41.60 | - | - |
Transformer | 39.50 | 16.06 | 36.63 | - | - |
MUPPET BART Large | 44.45 | 21.25 | 41.4 | - | - |
RL + pg + cbdec | 40.66 | 17.87 | 37.06 | - | - |
Summary Loop Unsup | 37.7 | - | - | - | - |
GLM-XXLarge | 44.7 | 21.4 | 41.4 | - | - |
ERNIE-GENBASE | 42.30 | 19.92 | 39.68 | - | - |
BertSumExtAbs | 42.13 | 19.6 | 39.18 | - | - |
Li et al. | 40.30 | 18.02 | 37.36 | - | - |
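
As a rough illustration of how a row in this table is produced, the sketch below generates a summary with one of the listed models (PEGASUS) and can be paired with the ROUGE sketch under Metrics. The `google/pegasus-cnn_dailymail` checkpoint name and the use of Hugging Face `transformers` are assumptions, not stated on this page, and a single example will not match the test-set averages above.

```python
# Illustrative only: generate a summary with PEGASUS via Hugging Face transformers.
# Assumes `pip install transformers sentencepiece torch`; the checkpoint name is an
# assumption, not taken from this leaderboard.
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = "google/pegasus-cnn_dailymail"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

article = (
    "The city council approved a new transit plan on Tuesday that adds two bus "
    "lines and extends evening service, with construction expected to begin next year."
)

inputs = tokenizer(article, truncation=True, padding="longest", return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=64)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```

Averaging ROUGE F1 between such generated summaries and the reference highlights over the CNN / Daily Mail test split is how the numbers in the table are typically obtained.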