Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization

Jun Suzuki; Masaaki Nagata

Abstract

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder, and to control the output words in the decoder based on this estimate. Our method shows significant improvement over a strong RNN-based encoder-decoder baseline and achieved the best results at the time on a standard abstractive summarization benchmark.
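
As a rough illustration of the idea in the abstract, the sketch below constrains greedy decoding with per-word upper-bound counts: once a word's estimated budget is used up, it is masked out of the decoder's logits, so it cannot be generated again. This is a simplified interpretation, not the authors' exact WFE formulation; in particular, the `estimate_upper_bounds` heuristic (copying source-side counts instead of learning the bounds jointly with the encoder) and the hard masking scheme are assumptions made for illustration.

```python
# Simplified sketch of word-frequency-constrained decoding: estimate an
# upper-bound count for each target word on the encoder side, then suppress
# words in the decoder once their budget is exhausted.

import numpy as np

VOCAB = ["the", "cat", "sat", "mat", "<eos>"]
NEG_INF = -1e9

def estimate_upper_bounds(source_tokens):
    """Crude stand-in for the encoder-side frequency estimator: allow each
    target word at most as many times as it occurs in the source (the paper
    instead learns these upper bounds jointly with the encoder)."""
    bounds = np.zeros(len(VOCAB))
    for tok in source_tokens:
        if tok in VOCAB:
            bounds[VOCAB.index(tok)] += 1
    bounds[VOCAB.index("<eos>")] = 1  # always allow termination
    return bounds

def decode(logits_per_step, budgets):
    """Greedy decoding that masks out words whose remaining budget is zero."""
    remaining = budgets.copy()
    output = []
    for logits in logits_per_step:
        masked = np.where(remaining > 0, logits, NEG_INF)
        idx = int(np.argmax(masked))
        word = VOCAB[idx]
        remaining[idx] -= 1  # consume one unit of this word's budget
        output.append(word)
        if word == "<eos>":
            break
    return output

# Toy decoder logits in which "the" would otherwise win at every step,
# producing the redundant repetition the method is designed to cut off.
steps = [np.array([3.0, 1.0, 0.5, 0.2, 0.0]),
         np.array([3.0, 2.5, 0.5, 0.2, 0.0]),
         np.array([3.0, 2.5, 1.0, 0.2, 0.0]),
         np.array([3.0, 2.5, 1.0, 0.8, 0.6])]

budgets = estimate_upper_bounds(["the", "cat", "sat"])
print(decode(steps, budgets))  # ['the', 'cat', 'sat', '<eos>']
```

Without the mask, greedy decoding over these logits would emit "the" at every step; with the budget in place, each word is cut off after its estimated upper-bound count is reached.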

Benchmarks

Benchmark                               Method       ROUGE-1   ROUGE-2   ROUGE-L
Text Summarization on DUC 2004 Task 1   EncDec+WFE   32.28     10.54     27.80
Text Summarization on Gigaword          EncDec+WFE   36.30     17.31     33.88
