Complexity-Weighted Loss and Diverse Reranking for Sentence Simplification

Reno Kriz; João Sedoc; Marianna Apidianaki; Carolina Zheng; Gaurav Kumar; Eleni Miltsakaki; Chris Callison-Burch

Abstract

Sentence simplification is the task of rewriting texts so they are easier to understand. Recent research has applied sequence-to-sequence (Seq2Seq) models to this task, focusing largely on training-time improvements via reinforcement learning and memory augmentation. One of the main problems with applying generic Seq2Seq models for simplification is that these models tend to copy directly from the original sentence, resulting in outputs that are relatively long and complex. We aim to alleviate this issue through the use of two main techniques. First, we incorporate content word complexities, as predicted with a leveled word complexity model, into our loss function during training. Second, we generate a large set of diverse candidate simplifications at test time, and rerank these to promote fluency, adequacy, and simplicity. Here, we measure simplicity through a novel sentence complexity model. These extensions allow our models to perform competitively with state-of-the-art systems while generating simpler sentences. We report standard automatic and human evaluation metrics.
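As a concrete illustration of the first technique, a complexity-weighted training loss can be sketched as a per-token negative log-likelihood whose terms are scaled by predicted word complexity. This is a minimal sketch, not the authors' implementation: the `complexity` scores (assumed to lie in [0, 1]), the `alpha` hyperparameter, and the sign convention of the weighting are all assumptions for illustration.

```python
import math

def complexity_weighted_nll(log_probs, targets, complexity, alpha=1.0):
    """Token-level negative log-likelihood where each target token's
    contribution is scaled by a weight derived from its complexity.

    log_probs:  per-position sequences mapping token id -> log-probability
    targets:    gold token ids, one per position
    complexity: per-token complexity scores in [0, 1]
                (hypothetical output of a leveled word-complexity model)
    alpha:      penalty strength (assumed hyperparameter)
    """
    total = 0.0
    for pos, (tok, c) in enumerate(zip(targets, complexity)):
        nll = -log_probs[pos][tok]
        # Upweight simple words so the decoder is pushed toward simpler
        # vocabulary; the direction of the weighting is an assumption.
        weight = 1.0 + alpha * (1.0 - c)
        total += weight * nll
    return total / len(targets)
```

For example, with a uniform 3-word vocabulary, a fully simple target token (complexity 0.0) contributes twice the loss of a fully complex one (complexity 1.0) at `alpha=1.0`.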

Code Repositories

rekriz11/DeDiv (mentioned in GitHub)
rekriz11/sockeye-recipes (official; mentioned in GitHub)
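The second technique from the abstract, reranking a large set of diverse candidate simplifications, amounts to scoring each candidate for fluency, adequacy, and simplicity and sorting by a combined score. The sketch below assumes hypothetical scorer callables and weights; the paper's actual scorers (e.g. its sentence complexity model) are not reproduced here.

```python
def rerank(candidates, fluency, adequacy, simplicity, weights=(1.0, 1.0, 1.0)):
    """Order candidate simplifications by a weighted sum of three scores.

    fluency, adequacy, simplicity: hypothetical callables mapping a
    sentence string to a float, where higher is better.
    weights: relative importance of each criterion (assumed).
    """
    wf, wa, ws = weights
    scored = [
        (wf * fluency(c) + wa * adequacy(c) + ws * simplicity(c), c)
        for c in candidates
    ]
    # Best-scoring candidate first.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in scored]
```

With, say, a simplicity scorer that prefers shorter sentences, the top-ranked output is the shortest candidate when the other scorers are neutral.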

Benchmarks

Benchmark: text-simplification-on-newsela
Methodology: S2S-Cluster-FA
Metrics: BLEU 19.55, SARI 30.73
