Copy mechanism and tailored training for character-based data-to-text generation

Marco Roberti; Giovanni Bonetta; Rossella Cancelliere; Patrick Gallinari

Abstract

In the last few years, many different methods have focused on using deep recurrent neural networks for natural language generation. The most widely used sequence-to-sequence neural methods are word-based: as such, they need pre- and post-processing steps called delexicalization and relexicalization to deal with uncommon or unknown words. These forms of processing, however, give rise to models that depend on the vocabulary used and are not completely neural. In this work, we present an end-to-end sequence-to-sequence model with an attention mechanism which reads and generates at the character level, no longer requiring delexicalization, tokenization, or even lowercasing. Moreover, since characters constitute the common "building blocks" of every text, it also allows a more general approach to text generation, making it possible to exploit transfer learning during training. These abilities stem from two major features: (i) the possibility of alternating between the standard generation mechanism and a copy mechanism, which allows the model to copy input facts directly into the output, and (ii) an original training pipeline that further improves the quality of the generated texts. We also introduce a new dataset called E2E+, a modified version of the well-known E2E dataset used in the E2E Challenge, designed to highlight the copying capabilities of character-based models. We evaluated our model according to five broadly accepted metrics (including the widely used BLEU), showing that it yields competitive performance with respect to both character-based and word-based approaches.
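The copy mechanism described in point (i) can be illustrated with a minimal sketch in the pointer-generator style: a soft gate mixes the decoder's vocabulary distribution with the attention distribution over input characters, so probability mass can flow directly to characters that appear in the input. The names used here (`p_gen`, `attn`, `vocab_dist`) are illustrative assumptions, not identifiers from the paper's code.

```python
import numpy as np

def copy_mix(vocab_dist, attn, input_chars, char_to_id, p_gen):
    """Combine generation and copy distributions into one output distribution.

    vocab_dist  -- P(char) from the decoder softmax, shape (V,)
    attn        -- attention weights over input positions, shape (T,)
    input_chars -- the input sequence as a string of length T
    char_to_id  -- maps each character to its vocabulary index
    p_gen       -- scalar in [0, 1]: probability of generating vs. copying
    """
    final = p_gen * vocab_dist.copy()
    # Scatter copy probability mass onto the characters present in the input.
    for t, ch in enumerate(input_chars):
        final[char_to_id[ch]] += (1.0 - p_gen) * attn[t]
    return final

vocab = list("abcde")
char_to_id = {c: i for i, c in enumerate(vocab)}
vocab_dist = np.full(len(vocab), 1.0 / len(vocab))  # uniform generator
attn = np.array([0.7, 0.2, 0.1])                    # attends mostly to 'c'
out = copy_mix(vocab_dist, attn, "cab", char_to_id, p_gen=0.4)
print(out.sum())     # 1.0 -- still a valid distribution
print(out.argmax())  # 2, the index of 'c': copying dominates generation
```

Because both mixed terms are valid distributions, the result always sums to one; in the actual model, `p_gen` would itself be predicted from the decoder state at every step rather than fixed.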

Code Repositories

marco-roberti/char-data-to-text-gen (official, PyTorch)

Benchmarks

Benchmark: Data-to-Text Generation on E2E NLG Challenge

Methodology   BLEU   CIDEr   METEOR  NIST    ROUGE-L
EDA_CS        67.05  2.2355  44.49   8.5150  68.94
EDA_CS (TL)   65.80  2.1803  45.16   8.5615  67.40
