Clément Rebuffel Laure Soulier Geoffrey Scoutheeten Patrick Gallinari

Abstract
Transcribing structured data into natural language descriptions has emerged as a challenging task, referred to as "data-to-text". These structures generally group together several elements and their attributes. Most previous attempts rely on encoder-decoder methods borrowed from machine translation, which linearize the elements into a sequence; this, however, discards most of the structure contained in the data. In this work, we propose to overcome this limitation with a hierarchical model that encodes the data structure at both the element level and the structure level. Evaluations on RotoWire show the effectiveness of our model with respect to qualitative and quantitative metrics.
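As a rough illustration of the two-level encoding idea described in the abstract, the sketch below encodes the records of each entity with an element-level Transformer encoder, pools them into one vector per entity, and passes those vectors to a structure-level encoder. It is a minimal sketch only: the class name, dimensions, mean-pooling, and the single-token-per-record simplification are assumptions for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    """Two-level encoder: records within each entity are encoded first,
    then one summary vector per entity is encoded at the structure level."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        low_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        high_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Element-level encoder: attends over the records of a single entity.
        self.low_encoder = nn.TransformerEncoder(low_layer, num_layers)
        # Structure-level encoder: attends over one summary vector per entity.
        self.high_encoder = nn.TransformerEncoder(high_layer, num_layers)

    def forward(self, records):
        # records: (batch, num_entities, num_records) of token ids,
        # one id per (key, value) record, a simplification for this sketch.
        b, e, r = records.shape
        x = self.embed(records).view(b * e, r, -1)
        x = self.low_encoder(x)                      # element-level encoding
        entity_repr = x.mean(dim=1).view(b, e, -1)   # pool records into one vector per entity
        return self.high_encoder(entity_repr)        # structure-level encoding
```

In the actual model, key and value embeddings would typically be combined per record rather than using a single token id; the point of the sketch is only the separation between element-level and structure-level attention.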
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| data-to-text-generation-on-rotowire | Hierarchical Transformer encoder + conditional copy | BLEU: 17.50 |
| data-to-text-generation-on-rotowire-content | Hierarchical Transformer encoder + conditional copy | BLEU: 17.50, DLD: 18.90% |
| data-to-text-generation-on-rotowire-content-1 | Hierarchical Transformer encoder + conditional copy | Precision: 39.47%, Recall: 51.64% |
| data-to-text-generation-on-rotowire-relation | Hierarchical Transformer encoder + conditional copy | Precision: 89.46%, Count: 21.17 |
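The methodology column pairs the hierarchical encoder with a conditional copy mechanism: at each decoding step, a learned gate decides whether to generate a word from the vocabulary or to copy a value from the source records. The sketch below illustrates such a step; the class name, projections, and tensor shapes are illustrative assumptions, not the exact formulation used in the paper.

```python
import torch
import torch.nn as nn

class ConditionalCopyStep(nn.Module):
    """One decoding step combining a vocabulary distribution with a copy
    distribution over source tokens, mixed by a learned copy gate."""

    def __init__(self, d_model, vocab_size):
        super().__init__()
        self.gen_proj = nn.Linear(d_model, vocab_size)  # generate-from-vocabulary logits
        self.copy_gate = nn.Linear(d_model, 1)          # scalar gate p(copy | decoder state)

    def forward(self, dec_state, attn_scores, src_token_ids):
        # dec_state:     (batch, d_model)  current decoder hidden state
        # attn_scores:   (batch, src_len)  unnormalized attention over source records
        # src_token_ids: (batch, src_len)  vocabulary ids of the source record values
        p_copy = torch.sigmoid(self.copy_gate(dec_state))            # (batch, 1)
        gen_dist = torch.softmax(self.gen_proj(dec_state), dim=-1)   # (batch, vocab)
        copy_attn = torch.softmax(attn_scores, dim=-1)               # (batch, src_len)
        # Scatter attention mass onto the vocabulary ids of the source tokens.
        copy_dist = torch.zeros_like(gen_dist).scatter_add(1, src_token_ids, copy_attn)
        # Final distribution: mixture of generation and copy, weighted by the gate.
        return (1.0 - p_copy) * gen_dist + p_copy * copy_dist
```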