Going out on a limb: Joint Extraction of Entity Mentions and Relations without Dependency Trees

Arzoo Katiyar, Claire Cardie

Abstract

We present a novel attention-based recurrent neural network for joint extraction of entity mentions and relations. We show that attention, combined with a long short-term memory (LSTM) network, can extract semantic relations between entity mentions without access to dependency trees. Experiments on the Automatic Content Extraction (ACE) corpora show that our model significantly outperforms the feature-based joint model of Li and Ji (2014). We also compare our model with an end-to-end tree-based LSTM model (SPTree) by Miwa and Bansal (2016) and show that our model performs within 1% on entity mentions and 2% on relations. Our fine-grained analysis also shows that our model performs significantly better on Agent-Artifact relations, while SPTree performs better on Physical and Part-Whole relations.
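The abstract's core idea is scoring relations by attending from the current token's recurrent hidden state over the hidden states of earlier entity candidates, rather than walking a dependency tree. The following is a minimal sketch of that scoring step using additive (Bahdanau-style) attention over precomputed hidden vectors; the function name, weight matrices, and random inputs are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def attention_over_candidates(query, keys, W_q, W_k, v):
    """Additive attention: score each earlier hidden state (a relation
    candidate) against the current token's hidden state, then normalize.

    query: (d,)   hidden state of the current token
    keys:  (n, d) hidden states of earlier candidate positions
    Returns a probability distribution of shape (n,) over candidates.
    """
    # Project query and keys into a shared space, combine, and score.
    scores = np.tanh(keys @ W_k.T + query @ W_q.T) @ v  # shape (n,)
    # Numerically stable softmax over candidate positions.
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy example with random "hidden states" standing in for biLSTM outputs.
rng = np.random.default_rng(0)
d = 8                                   # hidden dimension (assumed)
query = rng.normal(size=d)              # current token
keys = rng.normal(size=(5, d))          # five earlier candidates
W_q = rng.normal(size=(d, d))
W_k = rng.normal(size=(d, d))
v = rng.normal(size=d)

probs = attention_over_candidates(query, keys, W_q, W_k, v)
print(probs.shape, round(float(probs.sum()), 6))
```

In the full model these hidden vectors would come from a trained biLSTM encoder and the attention distribution would be supervised to point at the correct relation argument; the sketch only shows the normalized scoring mechanics.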

Benchmarks

| Benchmark | Methodology | Sentence Encoder | Cross Sentence | NER Micro F1 | RE Micro F1 | RE+ Micro F1 |
|---|---|---|---|---|---|---|
| relation-extraction-on-ace-2004 | Attention | – | No | 79.6 | – | 45.7 |
| relation-extraction-on-ace-2005 | Attention | biLSTM | No | 82.6 | 55.9 | 53.6 |
