HyperAI超神经
Relation Extraction On Chemprot
Evaluation metric: Micro F1
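Micro F1 pools true positives, false positives, and false negatives across all relation classes before computing precision and recall, so frequent classes weigh more than rare ones. A minimal sketch, assuming predictions are represented as (instance id, relation label) pairs; the `CPR:*` labels and the exact data format are illustrative, not the official ChemProt scorer:

```python
def micro_f1(gold, pred):
    """Micro-averaged F1 over relation predictions.

    gold, pred: iterables of (instance_id, relation_label) pairs.
    Counts are pooled over all classes, then a single P/R/F1 is computed.
    """
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)          # correct (id, label) pairs
    fp = len(pred_set - gold_set)          # predicted but not in gold
    fn = len(gold_set - pred_set)          # in gold but missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example (hypothetical labels): 1 correct pair, 2 spurious, 2 missed.
gold = [(1, "CPR:3"), (2, "CPR:4"), (3, "CPR:9")]
pred = [(1, "CPR:3"), (2, "CPR:6"), (4, "CPR:4")]
print(round(micro_f1(gold, pred), 4))
```

With one true positive out of three predictions and three gold pairs, precision and recall are both 1/3, giving a micro F1 of about 0.3333.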
Benchmark results

Performance of each model on this benchmark:

| Model | Micro F1 | Paper Title | Repository |
|---|---|---|---|
| CharacterBERT (base, medical) | 73.44 | CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | - |
| BioM-BERT | - | BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | - |
| SciBert (Finetune) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| SciBERT (Base Vocab) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| ELECTRAMed | - | ELECTRAMed: a new pre-trained language representation model for biomedical NLP | - |
| PubMedBERT uncased | 77.24 | Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | - |
| SciFive Large | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioLinkBERT (large) | 79.98 | LinkBERT: Pretraining Language Models with Document Links | - |
| KeBioLM | - | Improving Biomedical Pretrained Language Models with Knowledge | - |
| BioMegatron | - | BioMegatron: Larger Biomedical Domain Language Model | - |
| BioT5X (base) | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioBERT | - | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | - |
| NCBI_BERT(large) (P) | - | Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets | - |