Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders
Jue Wang; Wei Lu

Abstract
Named entity recognition and relation extraction are two important fundamental problems. Joint learning algorithms have been proposed to solve both tasks simultaneously, and many of them cast the joint task as a table-filling problem. However, they typically focus on learning a single encoder (usually learning representations in the form of a table) to capture the information required for both tasks within the same space. We argue that it can be beneficial to design two distinct encoders to capture these two different types of information during learning. In this work, we propose the novel {\em table-sequence encoders}, in which two different encoders, a table encoder and a sequence encoder, are designed to help each other in the representation learning process. Our experiments confirm the advantage of having {\em two} encoders over {\em one}. On several standard datasets, our model shows significant improvements over existing approaches.
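The core idea can be illustrated with a minimal sketch: a sequence representation (one vector per token, used for NER) and a table representation (one vector per token pair, used for relation extraction) are refined in alternation, each reading from the other. Note this is a simplified illustration under assumed shapes, not the paper's exact architecture (which uses multi-dimensional RNNs and attention); the function and weight names (`table_sequence_layer`, `W_t`, `W_s`) are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def table_sequence_layer(seq, table, W_t, W_s):
    """One mutual-refinement step (illustrative sketch only).

    seq:   (n, d)    per-token representations (sequence encoder state)
    table: (n, n, d) per-word-pair representations (table encoder state)
    W_t:   (3d, d)   hypothetical projection for the table update
    W_s:   (2d, d)   hypothetical projection for the sequence update
    """
    n, d = seq.shape
    # Table encoder: cell (i, j) is updated from the sequence vectors of
    # words i and j together with its previous cell state.
    pair = np.concatenate(
        [np.repeat(seq[:, None, :], n, axis=1),   # row word i
         np.repeat(seq[None, :, :], n, axis=0),   # column word j
         table],
        axis=-1)                                   # (n, n, 3d)
    new_table = relu(pair @ W_t)                   # (n, n, d)
    # Sequence encoder: each token pools its table row as pair context.
    row_context = new_table.mean(axis=1)           # (n, d)
    new_seq = relu(np.concatenate([seq, row_context], axis=-1) @ W_s)
    return new_seq, new_table

# Toy run: 4 tokens, hidden size 8.
rng = np.random.default_rng(0)
n, d = 4, 8
seq = rng.standard_normal((n, d))
table = np.zeros((n, n, d))
W_t = rng.standard_normal((3 * d, d)) * 0.1
W_s = rng.standard_normal((2 * d, d)) * 0.1
for _ in range(2):  # a stack of two mutual-refinement layers
    seq, table = table_sequence_layer(seq, table, W_t, W_s)
# NER tags would be predicted from `seq`; relation labels from `table`.
print(seq.shape, table.shape)
```

The point of the sketch is the information flow: the table update reads the sequence state, and the sequence update reads the table state, so neither representation is forced to live in the other's space.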
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| relation-extraction-on-ace-2004 | Table-Sequence | Cross Sentence: No; NER Micro F1: 88.6; RE Micro F1: 63.3; RE+ Micro F1: 59.6 |
| relation-extraction-on-ace-2005 | Table-Sequence | Cross Sentence: No; NER Micro F1: 89.5; RE Micro F1: 67.6; RE+ Micro F1: 64.3; Sentence Encoder: ALBERT |
| relation-extraction-on-ade-corpus | Table-Sequence | NER Macro F1: 89.7; RE Macro F1: 80.1; RE+ Macro F1: 80.1 |
| relation-extraction-on-conll04 | Table-Sequence | NER Macro F1: 86.9; NER Micro F1: 90.1; RE+ Macro F1: 75.4; RE+ Micro F1: 73.6 |
| zero-shot-relation-triplet-extraction-on | TableSequence | Avg. F1: 6.37 |
| zero-shot-relation-triplet-extraction-on-wiki | TableSequence | Avg. F1: 6.4 |