End-to-end Neural Coreference Resolution

Kenton Lee; Luheng He; Mike Lewis; Luke Zettlemoyer

Abstract

We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector. The key idea is to directly consider all spans in a document as potential mentions and learn distributions over possible antecedents for each. The model computes span embeddings that combine context-dependent boundary representations with a head-finding attention mechanism. It is trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and is factored to enable aggressive pruning of potential mentions. Experiments demonstrate state-of-the-art performance, with a gain of 1.5 F1 on the OntoNotes benchmark and of 3.1 F1 using a 5-model ensemble, despite the fact that this is the first approach to be successfully trained with no external resources.
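
The abstract compresses three mechanisms: span embeddings built from context-dependent boundary representations plus a head-finding attention average, a factored score (unary mention scores plus a pairwise antecedent score) that permits aggressive pruning, and training by marginal likelihood over gold antecedents. The PyTorch sketch below illustrates how these pieces can fit together; the layer sizes, class names, and dummy-antecedent handling are illustrative assumptions, not the authors' implementation (the official kentonl/e2e-coref code is in TensorFlow).

```python
# A minimal sketch of the ideas named in the abstract; dimensions and
# module names are assumptions for illustration (span-width features and
# other details of the paper are omitted).
import torch
import torch.nn as nn

class SpanScorer(nn.Module):
    def __init__(self, hidden_dim=200):
        super().__init__()
        self.head_attn = nn.Linear(hidden_dim, 1)        # head-finding attention weights
        span_dim = 3 * hidden_dim                         # [start; end; attended head]
        self.mention_score = nn.Sequential(               # unary mention score s_m(i)
            nn.Linear(span_dim, 150), nn.ReLU(), nn.Linear(150, 1))
        self.antecedent_score = nn.Sequential(            # pairwise antecedent score s_a(i, j)
            nn.Linear(3 * span_dim, 150), nn.ReLU(), nn.Linear(150, 1))

    def span_embedding(self, states, start, end):
        # states: (num_tokens, hidden_dim) context-dependent token representations
        tokens = states[start:end + 1]
        attn = torch.softmax(self.head_attn(tokens).squeeze(-1), dim=0)
        head = attn @ tokens                              # soft head-word representation
        return torch.cat([states[start], states[end], head])

    def coref_score(self, g_i, g_j):
        # Factored score s(i, j) = s_m(i) + s_m(j) + s_a(i, j); the unary terms
        # let low-scoring candidate spans be pruned before any pairwise work.
        pair = torch.cat([g_i, g_j, g_i * g_j])
        return (self.mention_score(g_i) + self.mention_score(g_j)
                + self.antecedent_score(pair)).squeeze(-1)

def marginal_log_likelihood(scores, gold_mask):
    # scores: (num_antecedents + 1,) with index 0 as a dummy antecedent of score 0
    # gold_mask: True for antecedents in the same gold cluster (or the dummy if
    # the span has none). The objective marginalizes over all correct antecedents
    # instead of committing to a single one.
    log_norm = torch.logsumexp(scores, dim=0)
    gold = torch.logsumexp(scores.masked_fill(~gold_mask, float('-inf')), dim=0)
    return gold - log_norm
```

In this formulation, candidate spans with low unary mention scores can be discarded before any pairwise antecedent scores are computed, which is what keeps the consider-all-spans approach tractable on full documents.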

Code Repositories

Achint08/e2e-coref-keras (mentioned in GitHub)
kentonl/e2e-coref (official implementation, TensorFlow; mentioned in GitHub)
shayneobrien/coreference-resolution (PyTorch; mentioned in GitHub)

Benchmarks

Benchmark                               Methodology            Metrics
Coreference Resolution on CoNLL-2012    e2e-coref (single)     Avg F1: 67.2
Coreference Resolution on CoNLL-2012    e2e-coref + ELMo       Avg F1: 70.4
Coreference Resolution on CoNLL-2012    e2e-coref (ensemble)   Avg F1: 68.8
Coreference Resolution on OntoNotes     e2e-coref              F1: 67.2

Avg F1 is the standard CoNLL-2012 coreference score, the average of the MUC, B-cubed, and CEAF F1 values.
