Named Entity Recognition (NER) on BC5CDR
Metrics
F1
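BC5CDR-style NER leaderboards typically report entity-level (exact-match) F1: a prediction counts as correct only if both its span boundaries and its type match a gold entity. The sketch below illustrates this metric; the `entity_f1` function name and the `(start, end, type)` tuple format are illustrative assumptions, not a specific library's API.

```python
# Minimal sketch of entity-level (exact-match) micro F1, assuming entities
# are represented as (start, end, type) tuples. Illustrative only.

def entity_f1(gold, pred):
    """Micro-averaged precision, recall, and F1 over exact entity matches."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)          # exact span + type matches
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: 2 of 3 predictions exactly match one of 3 gold entities.
gold = [(0, 4, "Chemical"), (10, 18, "Disease"), (25, 31, "Disease")]
pred = [(0, 4, "Chemical"), (10, 18, "Disease"), (40, 45, "Chemical")]
p, r, f = entity_f1(gold, pred)
print(round(p, 3), round(r, 3), round(f, 3))  # 0.667 0.667 0.667
```

In practice, published results are usually computed with a standard evaluation script (e.g. a CoNLL-style evaluator), so minor implementation differences can shift scores slightly.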
Results
F1 scores of various models on the BC5CDR benchmark, sorted from highest to lowest.
Model Name | F1 | Paper Title | Repository |
---|---|---|---|
BINDER | 91.9 | - | - |
ConNER | 91.3 | - | - |
CL-L2 | 90.99 | - | - |
aimped | 90.95 | - | - |
BertForTokenClassification (Spark NLP) | 90.89 | - | - |
BioLinkBERT (large) | 90.22 | - | - |
ELECTRAMed | 90.03 | - | - |
BLSTM-CNN-Char (SparkNLP) | 89.73 | - | - |
Spark NLP | 89.73 | - | - |
UniNER-7B | 89.34 | - | - |
SciBERT (SciVocab) | 88.94 | - | - |
GoLLIE | 88.4 | - | - |
SciBERT (Base Vocab) | 88.11 | - | - |
RDANER | 87.38 | - | - |
CollaboNet | 87.12 | - | - |
BERT-CRF | 86 | - | - |