Zero-Shot Learning
Zero-Shot Learning on MedConceptsQA
Metric: Accuracy
Results: performance of various models on this benchmark.
| Model Name | Accuracy (%) | Paper Title | Repository |
| --- | --- | --- | --- |
| gpt-4-0125-preview | 52.489 | GPT-4 Technical Report | - |
| yikuan8/Clinical-Longformer | 25.040 | Clinical-Longformer and Clinical-BigBird: Transformers for long clinical sequences | - |
| PharMolix/BioMedGPT-LM-7B | 24.747 | BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine | - |
| UFNLP/gatortron-medium | 24.862 | GatorTron: A Large Clinical Language Model to Unlock Patient Information from Unstructured Electronic Health Records | - |
| HuggingFaceH4/zephyr-7b-beta | 25.538 | Zephyr: Direct Distillation of LM Alignment | - |
| dmis-lab/meerkat-7b-v1.0 | 25.680 | Small Language Models Learn Enhanced Reasoning Skills from Medical Textbooks | - |
| johnsnowlabs/JSL-MedMNX-7B | 24.427 | - | - |
| meta-llama/Meta-Llama-3-8B-Instruct | 25.840 | LLaMA: Open and Efficient Foundation Language Models | - |
| BioMistral/BioMistral-7B-DARE | 24.569 | BioMistral: A Collection of Open-Source Pretrained Large Language Models for Medical Domains | - |
| epfl-llm/meditron-70b | 25.360 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models | - |
| gpt-3.5-turbo | 37.058 | Language Models are Few-Shot Learners | - |
| epfl-llm/meditron-7b | 25.751 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models | - |
| dmis-lab/biobert-v1.1 | 26.151 | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | - |
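For reference, below is a minimal sketch of how zero-shot accuracy on a multiple-choice benchmark such as MedConceptsQA could be computed: correct predictions divided by total questions, reported as a percentage. The data layout (`question`, `options`, `answer` keys) and the `ask_model` callable are illustrative assumptions, not the benchmark's documented API. The clustering of most open models near 25% in the table above is consistent with chance-level guessing on a four-option multiple-choice format.

```python
# Sketch: zero-shot accuracy on a multiple-choice benchmark.
# `questions` is assumed to be a list of dicts with "question", "options",
# and "answer" keys; `ask_model` is a hypothetical callable that returns the
# model's chosen option letter (e.g. "A"-"D"). Both are placeholders for
# illustration, not part of the MedConceptsQA release.
from typing import Callable, Dict, List


def zero_shot_accuracy(
    questions: List[Dict],
    ask_model: Callable[[str, List[str]], str],
) -> float:
    """Return accuracy in percent: correct predictions / total questions."""
    correct = 0
    for item in questions:
        predicted = ask_model(item["question"], item["options"])
        if predicted.strip().upper() == item["answer"].strip().upper():
            correct += 1
    return 100.0 * correct / len(questions)


if __name__ == "__main__":
    # Toy example with a fixed "model" that always answers "A".
    toy = [
        {"question": "Q1", "options": ["A", "B", "C", "D"], "answer": "A"},
        {"question": "Q2", "options": ["A", "B", "C", "D"], "answer": "C"},
    ]
    always_a = lambda question, options: "A"
    print(f"Accuracy: {zero_shot_accuracy(toy, always_a):.3f}%")  # 50.000%
```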