End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures
Makoto Miwa; Mohit Bansal

Abstract
We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 12.1% and 5.7% relative error reductions in F1-score on ACE2005 and ACE2004, respectively. We also show that our LSTM-RNN based model compares favorably to the state-of-the-art CNN based model (in F1-score) on nominal relation classification (SemEval-2010 Task 8). Finally, we present an extensive ablation analysis of several model components.
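The core architecture described in the abstract — a bidirectional tree-structured LSTM over the dependency parse, stacked on top of a bidirectional sequential LSTM whose states serve as its inputs — can be sketched in plain NumPy. This is an illustrative toy with random weights, not the authors' implementation: the dimensions, the child-sum tree-LSTM variant, and all function names here are assumptions for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with random weights (toy sizes, for illustration only)."""
    def __init__(self, in_dim, hid, rng):
        self.W = rng.standard_normal((4 * hid, in_dim + hid)) * 0.1
        self.b = np.zeros(4 * hid)
        self.hid = hid

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        return h_new, c_new

def bilstm(fwd_cell, bwd_cell, xs):
    """Sequence layer: run LSTMs in both directions, concatenate the states."""
    h = c = np.zeros(fwd_cell.hid); fwd = []
    for x in xs:
        h, c = fwd_cell.step(x, h, c); fwd.append(h)
    h = c = np.zeros(bwd_cell.hid); bwd = []
    for x in reversed(xs):
        h, c = bwd_cell.step(x, h, c); bwd.append(h)
    bwd.reverse()
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

class ChildSumTreeLSTM:
    """Child-sum tree-LSTM node update (one common variant; an assumption here)."""
    def __init__(self, in_dim, hid, rng):
        self.W = rng.standard_normal((3 * hid, in_dim)) * 0.1   # gates i, o, u
        self.U = rng.standard_normal((3 * hid, hid)) * 0.1
        self.b = np.zeros(3 * hid)
        self.Wf = rng.standard_normal((hid, in_dim)) * 0.1      # per-child forget gate
        self.Uf = rng.standard_normal((hid, hid)) * 0.1
        self.bf = np.zeros(hid)
        self.hid = hid

    def node(self, x, children):
        h_sum = sum((h for h, _ in children), np.zeros(self.hid))
        i, o, u = np.split(self.W @ x + self.U @ h_sum + self.b, 3)
        c = sigmoid(i) * np.tanh(u)
        for h_k, c_k in children:  # each child gets its own forget gate
            c = c + sigmoid(self.Wf @ x + self.Uf @ h_k + self.bf) * c_k
        return sigmoid(o) * np.tanh(c), c

def run_tree(tree_cell, seq_states, heads):
    """Compose tree-LSTM states bottom-up over a dependency tree (heads[i] = parent, -1 = root)."""
    children = {i: [] for i in range(len(heads))}
    root = 0
    for i, h in enumerate(heads):
        if h < 0:
            root = i
        else:
            children[h].append(i)
    def rec(i):
        return tree_cell.node(seq_states[i], [rec(j) for j in children[i]])
    return rec(root)

# Toy example: 5 random "word embeddings", a made-up dependency tree.
rng = np.random.default_rng(0)
emb, hid = 8, 16
xs = [rng.standard_normal(emb) for _ in range(5)]
seq = bilstm(LSTMCell(emb, hid, rng), LSTMCell(emb, hid, rng), xs)
tree = ChildSumTreeLSTM(2 * hid, hid, rng)   # tree layer reads the stacked biLSTM states
h_root, c_root = run_tree(tree, seq, heads=[1, 4, 1, 4, -1])
```

The stacking is visible in the last lines: the tree-LSTM consumes the 2×`hid`-dimensional biLSTM states rather than raw embeddings, so sequence and tree substructure information share one representation, which is what lets entity and relation detection share parameters in the full model.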
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| relation-extraction-on-ace-2004 | SPTree | Cross Sentence: No; NER Micro F1: 81.8; RE+ Micro F1: 48.4 |
| relation-extraction-on-ace-2005 | SPTree | Cross Sentence: No; NER Micro F1: 83.4; RE+ Micro F1: 55.6; Sentence Encoder: biLSTM |
| relation-extraction-on-nyt11-hrl | SPTree | F1: 53.1 |