ITER: Iterative Transformer-based Entity Recognition and Relation Extraction

Moritz Hennen, Florian Babl, Michaela Geierhos


Abstract

When extracting structured information from text, recognizing entities and extracting relationships are essential. Recent advances in both tasks generate a structured representation of the information in an autoregressive manner, a time-consuming and computationally expensive approach. This naturally raises the question of whether autoregressive methods are necessary to achieve comparable results. In this work, we propose ITER, an efficient encoder-based relation extraction model that performs the task in three parallelizable steps, greatly accelerating a recent language modeling approach: ITER achieves an inference throughput of over 600 samples per second for a large model on a single consumer-grade GPU. Furthermore, we achieve state-of-the-art results on the relation extraction datasets ADE and ACE05, and demonstrate competitive performance for both named entity recognition with GENIA and CoNLL03, and for relation extraction with SciERC and CoNLL04.
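The abstract's key claim is that extraction can be decomposed into three steps that each run in parallel over the input, rather than decoding the output structure token by token. A minimal toy sketch of such a non-autoregressive pipeline is shown below. This is not the authors' implementation: the pooling scheme, weight shapes, and the `extract` function itself are illustrative assumptions, and the token encodings `H` stand in for the output of a pretrained encoder (the paper uses, e.g., FLAN T5 3B).

```python
def extract(H, w_start, w_end, W_type, W_rel, types, relations, max_width=4):
    """Toy sketch of non-autoregressive entity and relation extraction.

    H: list of token encoding vectors (stand-in for a pretrained encoder's output).
    All weight vectors/matrices are illustrative stand-ins for learned classifier heads.
    """
    n = len(H)
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))

    # Step 1: boundary detection -- every token is scored independently,
    # so this step parallelizes over tokens.
    starts = [i for i in range(n) if dot(H[i], w_start) > 0]
    ends = [j for j in range(n) if dot(H[j], w_end) > 0]

    # Step 2: entity typing -- every candidate (start, end) span is classified
    # independently; the last class index means "not an entity".
    entities = []
    for i in starts:
        for j in ends:
            if i <= j < i + max_width:
                pooled = [a + b for a, b in zip(H[i], H[j])]
                scores = [dot(pooled, w) for w in W_type]
                best = max(range(len(scores)), key=scores.__getitem__)
                if best < len(types):
                    entities.append((i, j, types[best]))

    # Step 3: relation classification -- all ordered entity pairs are scored
    # independently; the last class index means "no relation".
    rels = []
    for a, (i1, _, _) in enumerate(entities):
        for b, (i2, _, _) in enumerate(entities):
            if a == b:
                continue
            pooled = [x + y for x, y in zip(H[i1], H[i2])]
            scores = [dot(pooled, w) for w in W_rel]
            best = max(range(len(scores)), key=scores.__getitem__)
            if best < len(relations):
                rels.append((entities[a][:2], relations[best], entities[b][:2]))
    return entities, rels
```

Because no step conditions on previously generated output tokens, each step is a batchable classification pass, which is what makes the high inference throughput reported in the abstract plausible.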

Benchmarks

Benchmark: relation-extraction-on-ace-2005
Methodology: ITER
Metrics:
  Cross Sentence: Yes
  NER Micro F1: 91.6 ± 0.12
  RE Micro F1: 75.1 ± 0.49
  RE+ Micro F1: 71.9 ± 0.56
  Sentence Encoder: FLAN T5 3B

Benchmark: relation-extraction-on-ade-corpus
Methodology: ITER
Metrics:
  NER Macro F1: 92.63 ± 0.89
  RE+ Macro F1: 85.6 ± 1.42
