Boosting Entity Linking Performance by Leveraging Unlabeled Documents

Phong Le; Ivan Titov

Abstract

Modern entity linking systems rely on large collections of documents specifically annotated for the task (e.g., AIDA CoNLL). In contrast, we propose an approach which exploits only naturally occurring information: unlabeled documents and Wikipedia. Our approach consists of two stages. First, we construct a high-recall list of candidate entities for each mention in an unlabeled document. Second, we use the candidate lists as weak supervision to constrain our document-level entity linking model. The model treats entities as latent variables and, when estimated on a collection of unlabeled texts, learns to choose entities relying both on the local context of each mention and on coherence with other entities in the document. The resulting approach rivals fully-supervised state-of-the-art systems on standard test sets. It also approaches their performance in the very challenging setting: when tested on a test set sampled from the data used to estimate the supervised systems. By comparing to Wikipedia-only training of our model, we demonstrate that modeling unlabeled documents is beneficial.
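
To make the two-stage idea in the abstract concrete, below is a minimal, illustrative Python sketch; it is not the authors' implementation. Stage 1 is represented by a hypothetical mention-to-candidate table (in the real setting, derived from Wikipedia anchor statistics), and Stage 2 is a toy joint scorer that combines a local score per mention with a pairwise coherence score, constrained to the candidate lists. All mention strings, candidate priors, and relatedness values are made-up placeholders.

```python
# Illustrative sketch of the two-stage setup described in the abstract:
# (1) a high-recall candidate list per mention, (2) a document-level scorer
# that picks entities using local evidence plus coherence, restricted to the
# candidate lists (the weak supervision constraint). Data here is hypothetical.

from itertools import product

# Stage 1: hypothetical candidate lists with prior scores per mention.
CANDIDATES = {
    "Paris": [("Paris_(city)", 0.8), ("Paris_Hilton", 0.2)],
    "France": [("France", 0.95), ("France_(band)", 0.05)],
}

# Hypothetical pairwise relatedness, standing in for a learned coherence model.
RELATEDNESS = {
    frozenset({"Paris_(city)", "France"}): 1.0,
}


def local_score(mention: str, entity: str) -> float:
    """Stand-in for a local context score; here just the candidate prior."""
    return dict(CANDIDATES[mention])[entity]


def coherence_score(entities: list[str]) -> float:
    """Sum of pairwise relatedness over the chosen entities."""
    total = 0.0
    for i in range(len(entities)):
        for j in range(i + 1, len(entities)):
            total += RELATEDNESS.get(frozenset({entities[i], entities[j]}), 0.0)
    return total


def link(mentions: list[str]) -> dict[str, str]:
    """Exhaustively score joint assignments drawn from the candidate lists
    and return the highest-scoring one (local scores + coherence)."""
    candidate_sets = [[e for e, _ in CANDIDATES[m]] for m in mentions]
    best, best_score = None, float("-inf")
    for assignment in product(*candidate_sets):
        score = sum(local_score(m, e) for m, e in zip(mentions, assignment))
        score += coherence_score(list(assignment))
        if score > best_score:
            best, best_score = assignment, score
    return dict(zip(mentions, best))


if __name__ == "__main__":
    print(link(["Paris", "France"]))
    # -> {'Paris': 'Paris_(city)', 'France': 'France'}
```

In the paper's actual model the entity choices are latent variables estimated from unlabeled text rather than scored by fixed tables, but the constraint structure is the same: only entities from the high-recall candidate lists are considered.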

Code Repositories

lephong/wnel (official implementation, PyTorch)

Benchmarks

Benchmark: entity-disambiguation-on-aida-conll
Methodology: Le & Titov (2019)
Metrics: In-KB Accuracy: 89.66
