# Named Entity Recognition (NER) on OntoNotes v5
## Metrics

F1
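NER benchmarks such as OntoNotes v5 conventionally report span-level micro-F1: a prediction counts as correct only if both the entity span and its type exactly match a gold entity. A minimal sketch of that scoring (the exact-match convention is an assumption; the leaderboard source does not spell out its matching rules):

```python
# Minimal sketch of span-level micro-F1 for NER.
# Entities are (start, end, type) tuples; exact span+type match is assumed.

def ner_f1(gold_spans, pred_spans):
    """Return micro-averaged (precision, recall, F1) over entity spans."""
    gold = set(gold_spans)
    pred = set(pred_spans)
    tp = len(gold & pred)  # exact matches of span boundaries and type
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: two gold entities, two predictions, one exactly correct.
gold = [(0, 2, "PERSON"), (5, 7, "ORG")]
pred = [(0, 2, "PERSON"), (5, 6, "ORG")]  # ORG span boundary is off by one
print(ner_f1(gold, pred))  # (0.5, 0.5, 0.5)
```

Note that partial overlaps score zero under this convention, which is why boundary errors are penalized as heavily as type errors.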
## Results

F1 scores of models on this benchmark, sorted from highest to lowest.
| Model Name | F1 |
|---|---|
| PL-Marker | 91.9 |
| Biaffine-NER | 91.3 |
| BERT-MRC | 91.11 |
| Syn-LSTM + BERT (wo doc-context) | 90.85 |
| W2NER | 90.50 |
| BARTNER | 90.38 |
| AESINER | 90.32 |
| HSCRF + softdict | 89.94 |
| DGLSTM-CRF + ELMo | 89.88 |
| NuNER | 89.1 |
| Syn-LSTM (wo doc-context) | 89.04 |
| CVT + Multi-Task + Large | 88.81 |
| DGLSTM-CRF (L=2) | 88.52 |
| Att-BiLSTM-CNN | 88.4 |
| BiLSTM-LAN | 88.16 |
| Hierarchical | 87.98 |
| Bi-LSTM-CRF + Lexical Features | 87.95 |
| GRN | 87.67 |
| BiLSTM-CRF | 86.99 |
| Joint Model | 84.04 |