
TENER: Adapting Transformer Encoder for Named Entity Recognition

Hang Yan, Bocao Deng, Xiaonan Li, Xipeng Qiu

Abstract

Bidirectional long short-term memory networks (BiLSTMs) have been widely used as encoders in models solving the named entity recognition (NER) task. Recently, the Transformer has been broadly adopted in various Natural Language Processing (NLP) tasks owing to its parallelism and strong performance. Nevertheless, the Transformer's performance on NER is not as good as its performance on other NLP tasks. In this paper, we propose TENER, a NER architecture that adopts an adapted Transformer encoder to model both character-level and word-level features. By incorporating direction- and relative-distance-aware attention and un-scaled attention, we show that a Transformer-like encoder is just as effective for NER as for other NLP tasks.
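
The two modifications named in the abstract are small but concrete. The sketch below is a minimal single-head simplification in PyTorch, not the authors' official code (their multi-head implementation lives in the fastnlp/TENER repository listed below); the class name, the Transformer-XL-style `u`/`v` bias terms, and the exact tensor shapes are assumptions made for illustration. It shows sinusoidal *relative* positional embeddings, which are direction-aware because sine is an odd function (so the embedding for offset t-j differs from that for j-t), and attention scores left un-scaled, i.e. never divided by sqrt(d_k). The paper's motivation for dropping the scaling is that it yields a sharper, sparser attention distribution, which suits NER, where only a few context words matter for each tag decision.

```python
# Minimal sketch (an assumption, not the official fastnlp/TENER code) of the
# two attention changes described in the abstract:
#   1) direction- and relative-distance-aware positional term, and
#   2) un-scaled attention (no division by sqrt(d_k)).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativeUnscaledAttention(nn.Module):  # hypothetical name for this sketch
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        assert d_model % 2 == 0, "sinusoidal embedding assumes even d_model"
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Learned global biases, analogous to Transformer-XL's u and v terms.
        self.u = nn.Parameter(torch.zeros(d_model))
        self.v = nn.Parameter(torch.zeros(d_model))
        # Sinusoidal embeddings for relative offsets -max_len..max_len.
        # sin is odd, so rel[t-j] != rel[j-t]: the encoding is direction-aware,
        # not merely distance-aware.
        pos = torch.arange(-max_len, max_len + 1, dtype=torch.float).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float)
                        * (-math.log(10000.0) / d_model))
        rel = torch.zeros(2 * max_len + 1, d_model)
        rel[:, 0::2] = torch.sin(pos * div)
        rel[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("rel", rel)
        self.max_len = max_len

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        seq_len = x.size(1)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Look up R[t - j] for every query position t and key position j.
        t = torch.arange(seq_len, device=x.device)
        r = self.rel[(t.unsqueeze(1) - t.unsqueeze(0)) + self.max_len]
        # Content term (Q + u) K^T plus positional term (Q + v) R^T.
        content = torch.einsum("bqd,bkd->bqk", q + self.u, k)
        position = torch.einsum("bqd,qkd->bqk", q + self.v, r)
        # Un-scaled: softmax over raw scores, no division by sqrt(d_model).
        attn = F.softmax(content + position, dim=-1)
        return torch.bmm(attn, v)

# Usage: encode a batch of 2 sequences of length 10 with 128-dim features.
layer = RelativeUnscaledAttention(d_model=128)
out = layer(torch.randn(2, 10, 128))
print(out.shape)  # torch.Size([2, 10, 128])
```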

Code Repositories

jaykay233/TF2.0-TENER (TensorFlow)
fastnlp/TENER (Official, PyTorch)
GeremWD/dlnlp_project (PyTorch)
HIT-SCIR/ltp (PyTorch)
dhiraa/tener (TensorFlow)
