Text Simplification by Tagging

Kostiantyn Omelianchuk, Vipul Raheja, Oleksandr Skurzhanskyi

Abstract

Edit-based approaches have recently shown promising results on multiple monolingual sequence transduction tasks. In contrast to conventional sequence-to-sequence (Seq2Seq) models, which learn to generate text from scratch as they are trained on parallel corpora, these methods have proven much more effective, since they learn to make fast and accurate transformations while leveraging powerful pre-trained language models. Inspired by these ideas, we present TST, a simple and efficient Text Simplification system based on sequence Tagging that leverages pre-trained Transformer-based encoders. Our system applies simple data augmentations and training- and inference-time tweaks to a pre-existing system, which makes it less reliant on large amounts of parallel training data, provides more control over the outputs, and enables faster inference. Our best model achieves near state-of-the-art performance on benchmark test datasets for the task. Since it is fully non-autoregressive, it achieves inference speeds over 11 times faster than the current state-of-the-art text simplification system.
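To make the tagging formulation concrete, the sketch below shows the core decoding step: each source token receives one edit tag, and all tags are applied in a single non-autoregressive pass. It assumes a GECToR-style tag vocabulary ($KEEP, $DELETE, $REPLACE_w, $APPEND_w); the tag names and the example sentence are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of applying token-level edit tags in one pass.
# Assumes a GECToR-style tag inventory; names and example are illustrative,
# not taken from the paper's released code.

def apply_edit_tags(tokens, tags):
    """Apply one edit tag per source token in a single left-to-right pass."""
    output = []
    for token, tag in zip(tokens, tags):
        if tag == "$KEEP":                        # copy token unchanged
            output.append(token)
        elif tag == "$DELETE":                    # drop token
            continue
        elif tag.startswith("$REPLACE_"):         # substitute a simpler word
            output.append(tag[len("$REPLACE_"):])
        elif tag.startswith("$APPEND_"):          # keep token, insert word after it
            output.append(token)
            output.append(tag[len("$APPEND_"):])
    return output

# Hypothetical example: lexical simplification via REPLACE tags.
tokens = ["The", "physician", "administered", "the", "medication", "."]
tags = ["$KEEP", "$REPLACE_doctor", "$REPLACE_gave", "$KEEP",
        "$REPLACE_medicine", "$KEEP"]
print(" ".join(apply_edit_tags(tokens, tags)))
# -> "The doctor gave the medicine ."
```

Because the encoder predicts every token's tag independently, the full edit is applied in one pass rather than token by token, which is what enables the inference speedup over autoregressive Seq2Seq decoders claimed in the abstract.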

Code Repositories

grammarly/gector (official, PyTorch)

Benchmarks

Benchmark                                 Methodology   Metrics
Text Simplification on ASSET              TST           SARI (EASSE >= 0.2.1): 43.21
Text Simplification on PWKP / WikiSmall   TST           SARI: 44.67; SARI (EASSE >= 0.2.1): 44.67
Text Simplification on TurkCorpus         TST           SARI (EASSE >= 0.2.1): 41.46
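The SARI scores above are computed with EASSE (version >= 0.2.1). As a usage sketch, the snippet below scores a system output with EASSE's documented corpus_sari entry point; the sentences are made up for illustration.

```python
# Sketch of scoring simplification outputs with EASSE's SARI implementation
# (pip install easse). Sentences are illustrative, not from the benchmark data.
from easse.sari import corpus_sari

orig_sents = ["The physician administered the medication."]   # source sentences
sys_sents = ["The doctor gave the medicine."]                 # system outputs
refs_sents = [                                                # one list per reference set
    ["The doctor gave the medicine."],
    ["The doctor gave the patient the medicine."],
]

print(corpus_sari(orig_sents=orig_sents,
                  sys_sents=sys_sents,
                  refs_sents=refs_sents))
```

Note that refs_sents is a list of reference lists (one per annotator), which is how multi-reference datasets such as TurkCorpus and ASSET are scored.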
