Data-to-text Generation with Variational Sequential Planning

Ratish Puduppully Yao Fu Mirella Lapata

Abstract

We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input. We focus on generating long-form text, i.e., documents with multiple paragraphs, and propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way. We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Experiments on two data-to-text benchmarks (RotoWire and MLB) show that our model outperforms strong baselines and is sample efficient in the face of limited training data (e.g., a few hundred instances).
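
The interleaved plan-then-generate loop described in the abstract can be pictured with a short PyTorch sketch (PyTorch is the framework of the official repository). The sketch is illustrative only, assuming a discrete latent plan per paragraph drawn from a learned prior over plan slots; class and parameter names (SeqPlanSketch, plan_prior, num_plan_slots) are hypothetical and do not mirror the official data2text-seq-plan-py code.

```python
# Minimal sketch (not the authors' code) of interleaved variational planning
# and generation: at each step a latent plan is sampled conditioned on the
# running context, then a paragraph is decoded conditioned on that plan, and
# the context is updated with both before the next planning decision.
import torch
import torch.nn as nn

class SeqPlanSketch(nn.Module):
    def __init__(self, vocab_size=1000, num_plan_slots=50, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Prior over the next latent plan, conditioned on the context state.
        self.plan_prior = nn.Linear(hidden, num_plan_slots)
        self.plan_embed = nn.Embedding(num_plan_slots, hidden)
        # Context RNN tracks previous plans and previously generated paragraphs.
        self.context_rnn = nn.GRUCell(hidden, hidden)
        # Paragraph decoder generates tokens conditioned on the sampled plan.
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    @torch.no_grad()
    def generate(self, num_paragraphs=3, max_len=20, bos_id=1):
        h_ctx = torch.zeros(1, self.context_rnn.hidden_size)
        document = []
        for _ in range(num_paragraphs):
            # 1) Planning step: sample a discrete latent plan z_k from the prior,
            #    which depends on previous plans and previous text via h_ctx.
            plan_logits = self.plan_prior(h_ctx)
            z_k = torch.distributions.Categorical(logits=plan_logits).sample()
            plan_vec = self.plan_embed(z_k)

            # 2) Generation step: decode a paragraph conditioned on z_k and context.
            tokens = [bos_id]
            h_dec = (plan_vec + h_ctx).unsqueeze(0)  # (1, 1, hidden) initial state
            for _ in range(max_len):
                inp = self.embed(torch.tensor([[tokens[-1]]]))
                out, h_dec = self.decoder(inp, h_dec)
                tokens.append(self.out(out[:, -1]).argmax(-1).item())
            document.append(tokens)

            # 3) Update the context with the plan and the generated paragraph,
            #    so the next planning decision conditions on both.
            para_repr = self.embed(torch.tensor(tokens)).mean(0, keepdim=True)
            h_ctx = self.context_rnn(para_repr + plan_vec, h_ctx)
        return document

model = SeqPlanSketch()
print([len(p) for p in model.generate()])
```

The sketch shows only the inference-time loop. At training time the prior would be paired with an approximate posterior that also conditions on the observed paragraph, with the reconstruction term and the KL between posterior and prior entering the variational objective; those details are omitted here.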

Code Repositories

ratishsp/data2text-seq-plan-py (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
data-to-text-generation-on-mlb-dataset | SeqPlan | Precision: 95.9, count: 28.9
data-to-text-generation-on-mlb-dataset-1 | SeqPlan | Precision: 43.3, Recall: 53.5
data-to-text-generation-on-mlb-dataset-2 | SeqPlan | BLEU: 14.29
data-to-text-generation-on-mlb-dataset-3 | SeqPlan | DLD: 22.7
data-to-text-generation-on-rotowire-relation | SeqPlan | Precision: 97.6, count: 46.7
