
Mention Memory: incorporating textual knowledge into Transformers through entity mention attention

Michiel de Jong, Yury Zemlyanskiy, Nicholas FitzGerald, Fei Sha, William Cohen


Abstract

Natural language understanding tasks such as open-domain question answering often require retrieving and assimilating factual information from multiple sources. We propose to address this problem by integrating a semi-parametric representation of a large text corpus into a Transformer model as a source of factual knowledge. Specifically, our method represents knowledge with "mention memory", a table of dense vector representations of every entity mention in a corpus. The proposed model, TOME, is a Transformer that accesses this information through internal memory layers in which each entity mention in the input passage attends to the mention memory. This approach enables synthesis of and reasoning over many disparate sources of information within a single Transformer model. In experiments using a memory of 150 million Wikipedia mentions, TOME achieves strong performance on several open-domain knowledge-intensive tasks, including the claim verification benchmarks HoVer and FEVER and several entity-based QA benchmarks. We also show that the model learns to attend to informative mentions without any direct supervision. Finally, we demonstrate that the model can generalize to new, unseen entities by updating the memory without retraining.
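To make the mechanism concrete, below is a minimal NumPy sketch of one such memory layer. The function name, the span-boundary query construction, the projection matrices W_q and W_u, and the exact top-k scoring (standing in for the approximate nearest-neighbor search a real 150-million-entry memory would require) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of a TOME-style mention-memory attention layer.
# All names, shapes, and the query construction are illustrative
# assumptions; the paper's actual implementation may differ.
import numpy as np

def mention_memory_attention(hidden, mention_spans, mem_keys, mem_values,
                             W_q, W_u, top_k=32):
    """For each detected entity mention in the passage, build a query from
    its span representation, attend over the (key, value) mention memory,
    and add the retrieved value back into the hidden state.

    hidden:        (seq_len, d_model) token representations
    mention_spans: list of (start, end) token indices of mentions
    mem_keys:      (num_mentions, d_key)   memory keys
    mem_values:    (num_mentions, d_value) memory values
    W_q:           (2 * d_model, d_key)    query projection (assumed)
    W_u:           (d_value, d_model)      projection back into the model
    """
    out = hidden.copy()
    for start, end in mention_spans:
        # Query: projection of the concatenated span-boundary tokens
        # (an assumed span encoding; requires top_k <= num_mentions).
        span_repr = np.concatenate([hidden[start], hidden[end - 1]])
        query = span_repr @ W_q                         # (d_key,)

        # Exact top-k here; at 150M entries this would be an
        # approximate nearest-neighbor search instead.
        scores = mem_keys @ query                       # (num_mentions,)
        top = np.argpartition(scores, -top_k)[-top_k:]

        # Numerically stable softmax over the retrieved candidates only.
        weights = np.exp(scores[top] - scores[top].max())
        weights /= weights.sum()

        # Weighted sum of memory values, projected back to d_model.
        retrieved = weights @ mem_values[top]           # (d_value,)
        out[start] = out[start] + retrieved @ W_u
    return out
```

Because the memory is a plain table of vectors rather than model weights, rows for new entities can be appended without touching the Transformer parameters, which is what allows the generalization-without-retraining result described in the abstract.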

Benchmarks

Benchmark                                  | Methodology | Metrics
passage-retrieval-on-entityquestions       | TOME-2      | Recall@20: 0.838
question-answering-on-complexwebquestions  | TOME-2      | EM: 47.7
question-answering-on-triviaqa             | TOME-2      | EM: 65.8
