Open Domain Question Answering on SearchQA
Metrics: F1

Results
Performance results of various models on this benchmark.
| Model Name | F1 | Paper Title | Repository |
|---|---|---|---|
| SpanBERT | 84.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans | |
| Denoising QA | 64.5 | Denoising Distantly Supervised Open-Domain Question Answering | |
| DecaProp | 63.6 | Densely Connected Attention Propagation for Reading Comprehension | |
| DECAPROP | - | Densely Connected Attention Propagation for Reading Comprehension | |
| Locality-Sensitive Hashing | - | Reformer: The Efficient Transformer | |
| Sparse Attention | - | Generating Long Sequences with Sparse Transformers | |
| Multi-passage BERT | - | Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering | - |
| Cluster-Former (#C=512) | - | Cluster-Former: Clustering-based Sparse Transformer for Long-Range Dependency Encoding | - |
| Bi-Attention + DCU-LSTM | - | Multi-Granular Sequence Encoding via Dilated Compositional Units for Reading Comprehension | - |
| Focused Hierarchical RNN | - | Focused Hierarchical RNNs for Conditional Sequence Processing | - |
| AMANDA | - | A Question-Focused Multi-Factor Attention Network for Question Answering | - |
| ASR | - | Text Understanding with the Attention Sum Reader Network | |
| R^3 | 55.3 | R^3: Reinforced Reader-Ranker for Open-Domain Question Answering | |
| DrQA | - | Reading Wikipedia to Answer Open-Domain Questions | |
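The F1 reported on SearchQA leaderboards is typically the token-level answer-overlap F1 used for extractive and open-domain QA: the predicted and reference answer strings are normalized, and precision and recall are computed over their shared tokens. Below is a minimal sketch of that metric, assuming SQuAD-style normalization; the exact evaluation scripts used by individual papers may differ.

```python
import re
import string
from collections import Counter


def normalize_answer(s: str) -> str:
    """Lowercase, strip punctuation and articles, and collapse whitespace (SQuAD-style)."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())


def f1_score(prediction: str, ground_truth: str) -> float:
    """Token-level F1 between a predicted answer string and one reference answer."""
    pred_tokens = normalize_answer(prediction).split()
    gold_tokens = normalize_answer(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)


# Example: partial overlap yields a partial score.
print(f1_score("the Eiffel Tower", "Eiffel Tower in Paris"))  # ≈ 0.67
```

When a dataset provides several reference answers per question, evaluation usually takes the maximum F1 over the references and averages that across questions.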