Towards Faithful Neural Table-to-Text Generation with Content-Matching Constraints

Zhenyi Wang Xiaoyang Wang Bang An Dong Yu Changyou Chen

Abstract

Table-to-text generation aims to translate a structured table into a natural-language description. Most existing methods ignore the faithfulness between the generated description and the original table, producing information that goes beyond the table's content. In this paper, we propose, for the first time, a novel Transformer-based generation framework that enforces faithfulness. The core techniques in our method are a new table-text optimal-transport matching loss and a table-text embedding similarity loss, both built on the Transformer model. Furthermore, to evaluate faithfulness, we propose a new automatic metric specialized to the table-to-text generation problem. We also provide a detailed analysis of each component of our model in our experiments. Automatic and human evaluations show that our framework outperforms the state-of-the-art by a large margin.
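The table-text optimal-transport matching loss mentioned above measures how cheaply the table's content embeddings can be "transported" onto the generated text's token embeddings. The sketch below is a generic entropic Sinkhorn approximation with assumed uniform marginals and cosine-distance costs; the paper's exact solver and cost definition may differ, and the function name and parameters here are illustrative only.

```python
import numpy as np

def sinkhorn_ot_cost(table_emb, text_emb, reg=0.1, n_iters=50):
    """Approximate OT cost between table-cell and text-token embeddings.

    table_emb: (n, d) array of table-content embeddings.
    text_emb:  (m, d) array of generated-text token embeddings.
    Returns a scalar; smaller means the text covers the table's
    content more faithfully (a sketch, not the paper's exact loss).
    """
    # Cosine-distance cost matrix between every table/text embedding pair.
    a = table_emb / np.linalg.norm(table_emb, axis=1, keepdims=True)
    b = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    C = 1.0 - a @ b.T                      # (n, m), entries in [0, 2]
    n, m = C.shape
    mu, nu = np.full(n, 1.0 / n), np.full(m, 1.0 / m)  # uniform marginals
    K = np.exp(-C / reg)                   # entropic kernel
    u = np.ones(n)
    for _ in range(n_iters):               # Sinkhorn fixed-point updates
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = np.diag(u) @ K @ np.diag(v)        # transport plan
    return float((P * C).sum())
```

In training, a loss like this would be minimized alongside the usual generation objective, so the decoder is penalized when generated tokens cannot be matched to any table content (hallucination) or table entries go unmatched (omission).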

Benchmarks

Benchmark                                    | Methodology | Metrics
data-to-text-generation-on-wikipedia-person | Ours        | BLEU: 24.56
