Text Summarization on PubMed (SOTA leaderboard, HyperAI超神经)

Evaluation metrics: ROUGE-1, ROUGE-2, ROUGE-L

Results of each model on this benchmark:

| Model | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper |
|-------|---------|---------|---------|-------|
| Top Down Transformer (AdaPool) (464M) | 51.05 | 23.26 | 46.47 | Long Document Summarization with Top-down and Bottom-up Inference |
| eyeglaxs | 50.34 | 24.57 | 45.96 | Scaling Up Summarization: Leveraging Large Language Models for Long Text Extractive Summarization |
| BART-LS | 50.3 | - | - | Adapting Pretrained Text-to-Text Models for Long Text Sequences |
| LongT5 | 50.23 | 24.76 | 46.67 | LongT5: Efficient Text-To-Text Transformer for Long Sequences |
| GoSum (extractive) | 49.83 | 23.56 | 45.10 | GoSum: Extractive Summarization of Long Documents by Reinforcement Learning and Graph Organized Discourse State |
| Lodoss-full-large (extractive) | 49.38 | 23.89 | 44.84 | Toward Unifying Text Segmentation and Long Document Summarization |
| MemSum (extractive) | 49.25 | 22.94 | 44.42 | MemSum: Extractive Summarization of Long Documents Using Multi-Step Episodic Markov Decision Processes |
| Lodoss-full-base (extractive) | 48.93 | 23.51 | 44.40 | Toward Unifying Text Segmentation and Long Document Summarization |
| HAT-BART | 48.25 | 21.35 | 36.69 | Hierarchical Learning for Generation with Long Source Sequences |
| GRETEL | 48.20 | 21.20 | 43.16 | GRETEL: Graph Contrastive Topic Enhanced Language Model for Long Document Extractive Summarization |
| DeepPyramidion | 47.81 | 21.14 | - | Sparsifying Transformer Models with Trainable Representation Pooling |
| FactorSum | 47.5 | 20.33 | 43.76 | Factorizing Content and Budget Decisions in Abstractive Summarization of Long Documents |
| HiStruct+ | 46.59 | 20.39 | 42.11 | HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information |
| DANCER PEGASUS | 46.34 | 19.97 | 42.42 | A Divide-and-Conquer Approach to the Summarization of Long Documents |
| BigBird-Pegasus | 46.32 | 20.65 | 42.33 | Big Bird: Transformers for Longer Sequences |
| ExtSum-LG+MMR-Select+ | 45.39 | 20.37 | 40.99 | Systematically Exploring Redundancy Reduction in Summarizing Long Documents |
| ExtSum-LG+RdLoss | 45.3 | 20.42 | 40.95 | Systematically Exploring Redundancy Reduction in Summarizing Long Documents |
| PEGASUS | 45.09 | - | - | PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization |
| Sent-CLF | 45.01 | - | - | On Extractive and Abstractive Neural Document Summarization with Transformer Language Models |
| ExtSum-LG | 44.81 | 19.74 | - | Extractive Summarization of Long Documents by Combining Global and Local Context |
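To make the three metrics above concrete, here is a minimal, illustrative sketch of ROUGE-N and ROUGE-L as token-level F1 scores. It is an assumption-laden simplification: published leaderboard numbers come from each paper's own evaluation pipeline (typically the official ROUGE toolkit or the `rouge_score` package, with stemming and other preprocessing), so this sketch will not reproduce them exactly.

```python
# Simplified ROUGE sketch -- NOT the official toolkit; whitespace
# tokenization only, no stemming, single reference per candidate.
from collections import Counter


def _ngrams(tokens, n):
    """Multiset of n-grams over a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def _f1(overlap, cand_total, ref_total):
    """F1 from an overlap count and the two marginal totals."""
    if overlap == 0:
        return 0.0
    p, r = overlap / cand_total, overlap / ref_total
    return 2 * p * r / (p + r)


def rouge_n(candidate, reference, n):
    """ROUGE-N: F1 over overlapping n-grams (n=1 -> ROUGE-1, n=2 -> ROUGE-2)."""
    cand, ref = _ngrams(candidate.split(), n), _ngrams(reference.split(), n)
    overlap = sum((cand & ref).values())  # multiset intersection
    return _f1(overlap, sum(cand.values()) or 1, sum(ref.values()) or 1)


def rouge_l(candidate, reference):
    """ROUGE-L: F1 based on the longest common subsequence of tokens."""
    a, b = candidate.split(), reference.split()
    # Classic dynamic-programming LCS table.
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return _f1(dp[-1][-1], len(a) or 1, len(b) or 1)
```

For example, `rouge_n("the cat sat", "the cat ran", 1)` overlaps on two of three unigrams, giving an F1 of 2/3; leaderboard scores are these values scaled to 0-100.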
(The full leaderboard contains 29 entries; only the first page is shown above.)