Continual Pretraining on SciERC
Metrics: F1 (macro)
Results

Performance of models reported on this benchmark, measured by macro-averaged F1.
Comparison Table

Model Name                             | F1 (macro)
---------------------------------------|-----------
continual-learning-of-language-models  | 0.7093
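
For reference, macro F1 is the unweighted mean of the per-class F1 scores, so rare entity classes count as much as frequent ones. Below is a minimal sketch of how such a score can be computed with scikit-learn; the labels shown are hypothetical placeholders, not data from the SciERC benchmark or this leaderboard.

```python
# Minimal sketch: computing F1 (macro) with scikit-learn.
# The label values below are hypothetical examples, not SciERC data.
from sklearn.metrics import f1_score

y_true = ["Task", "Method", "Metric", "Method", "Task"]
y_pred = ["Task", "Method", "Method", "Method", "Task"]

# average="macro" computes F1 separately for each class, then takes
# the unweighted mean across classes.
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(f"F1 (macro): {macro_f1:.4f}")
```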