HyperAI超神经
Named Entity Recognition (NER) on OntoNotes v5
Evaluation Metric: F1
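For reference, the F1 reported on NER benchmarks such as this one is typically span-level micro-F1: a predicted entity counts as correct only when its boundaries and type both match a gold entity exactly. A minimal sketch, assuming entities are represented as `(start, end, type)` triples (`span_f1` is an illustrative helper, not part of any benchmark tooling):

```python
def span_f1(gold, pred):
    """Micro-averaged span-level F1 over entity triples.

    An entity counts as a true positive only when its
    (start, end, type) triple exactly matches a gold entity.
    """
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example: one exact match, one boundary error, one missed entity.
gold = [(0, 2, "PER"), (5, 7, "ORG"), (9, 10, "GPE")]
pred = [(0, 2, "PER"), (5, 6, "ORG")]
print(round(span_f1(gold, pred), 4))  # → 0.4
```

This exact-match convention is stricter than token-level F1: partially overlapping predictions (like the `(5, 6, "ORG")` span above) earn no credit.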
Results

Performance of each model on this benchmark:

| Model | F1 | Paper |
| --- | --- | --- |
| Syn-LSTM + BERT (wo doc-context) | 90.85 | Better Feature Integration for Named Entity Recognition |
| DGLSTM-CRF + ELMo | 89.88 | Dependency-Guided LSTM-CRF for Named Entity Recognition |
| BiLSTM-LAN | 88.16 | Hierarchically-Refined Label Attention Network for Sequence Labeling |
| Bi-LSTM-CRF + Lexical Features | 87.95 | Robust Lexical Features for Improved Neural Network Named-Entity Recognition |
| PL-Marker | 91.9 | Packed Levitated Marker for Entity and Relation Extraction |
| Syn-LSTM (wo doc-context) | 89.04 | Better Feature Integration for Named Entity Recognition |
| BERT-MRC | 91.11 | A Unified MRC Framework for Named Entity Recognition |
| Hierarchical | 87.98 | Hierarchical Contextualized Representation for Named Entity Recognition |
| BiLSTM-CRF | 86.99 | Fast and Accurate Entity Recognition with Iterated Dilated Convolutions |
| DGLSTM-CRF (L=2) | 88.52 | Dependency-Guided LSTM-CRF for Named Entity Recognition |
| Joint Model | 84.04 | A Joint Model for Entity Analysis: Coreference, Typing, and Linking |
| Biaffine-NER | 91.3 | Named Entity Recognition as Dependency Parsing |
| CVT + Multi-Task + Large | 88.81 | Semi-Supervised Sequence Modeling with Cross-View Training |
| HSCRF + softdict | 89.94 | Towards Improving Neural Named Entity Recognition with Gazetteers |
| Att-BiLSTM-CNN | 88.4 | Why Attention? Analyze BiLSTM Deficiency and Its Remedies in the Case of NER |
| BARTNER | 90.38 | A Unified Generative Framework for Various NER Subtasks |
| NuNER | 89.1 | NuNER: Entity Recognition Encoder Pre-training via LLM-Annotated Data |
| W2NER | 90.50 | Unified Named Entity Recognition as Word-Word Relation Classification |
| AESINER | 90.32 | Improving Named Entity Recognition with Attentive Ensemble of Syntactic Information |
| GRN | 87.67 | GRN: Gated Relation Network to Enhance Convolutional Neural Network for Named Entity Recognition |