Strategies for Pre-training Graph Neural Networks

Weihua Hu, Bowen Liu, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay Pande, Jure Leskovec

Abstract

Many applications of machine learning require a model to make accurate predictions on test examples that are distributionally different from training ones, while task-specific labels are scarce during training. An effective approach to this challenge is to pre-train a model on related tasks where data is abundant, and then fine-tune it on a downstream task of interest. While pre-training has been effective in many language and vision domains, it remains an open question how to effectively use pre-training on graph datasets. In this paper, we develop a new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs). The key to the success of our strategy is to pre-train an expressive GNN at the level of individual nodes as well as entire graphs so that the GNN can learn useful local and global representations simultaneously. We systematically study pre-training on multiple graph classification datasets. We find that naive strategies, which pre-train GNNs at the level of either entire graphs or individual nodes, give limited improvement and can even lead to negative transfer on many downstream tasks. In contrast, our strategy avoids negative transfer and improves generalization significantly across downstream tasks, leading to absolute improvements of up to 9.4% in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.
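Concretely, the strategy combines a node-level self-supervised objective (such as attribute masking) with a graph-level supervised objective, so a shared GNN encoder learns local and global structure at once. Below is a minimal plain-PyTorch sketch of that combination; the TinyGNN encoder, the loss helpers, and the toy data are illustrative assumptions, not the authors' implementation (the official code is at snap-stanford/pretrain-gnns, where the encoder is a more expressive GIN and the two objectives are applied in sequence over large molecular datasets).

```python
# Minimal sketch of two-level GNN pre-training, assuming a toy dense-adjacency
# encoder. All names and shapes here are illustrative, not the paper's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGNN(nn.Module):
    """One-layer mean-aggregation GNN over a dense adjacency matrix."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        h = adj @ x / deg                      # mean-aggregate neighbor features
        return F.relu(self.lin(h))

def attribute_masking_loss(gnn, decoder, x, adj, mask_rate=0.15):
    """Node-level self-supervision: hide some node attributes and
    reconstruct them from the surrounding graph context."""
    n = x.size(0)
    k = max(1, int(mask_rate * n))             # mask at least one node
    mask = torch.zeros(n, dtype=torch.bool)
    mask[torch.randperm(n)[:k]] = True
    x_masked = x.clone()
    x_masked[mask] = 0.0                       # zero out masked attributes
    h = gnn(x_masked, adj)
    return F.mse_loss(decoder(h[mask]), x[mask])

def graph_level_loss(gnn, head, x, adj, y):
    """Graph-level supervision: pool node embeddings, predict a graph label."""
    g = gnn(x, adj).mean(dim=0, keepdim=True)  # mean pooling -> graph embedding
    return F.binary_cross_entropy_with_logits(head(g), y)

# Toy usage: one gradient step combining both pre-training signals.
x = torch.randn(6, 8)                          # 6 nodes, 8 attributes
adj = (torch.rand(6, 6) > 0.5).float()
y = torch.tensor([[1.0]])                      # one binary graph label
gnn, decoder, head = TinyGNN(8, 16), nn.Linear(16, 8), nn.Linear(16, 1)
opt = torch.optim.Adam(
    [*gnn.parameters(), *decoder.parameters(), *head.parameters()], lr=1e-3)
opt.zero_grad()
loss = attribute_masking_loss(gnn, decoder, x, adj) \
     + graph_level_loss(gnn, head, x, adj, y)
loss.backward()
opt.step()
```

The point of the sketch is the division of labor: the node-level term forces the encoder to capture local neighborhood structure, while the graph-level term shapes the pooled representation, which is what the abstract identifies as the key to avoiding negative transfer.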

Code Repositories

snap-stanford/pretrain-gnns (Official, PyTorch)
microsoft/fs-mol (PyTorch)
jacquesboitreaud/rna_ne (PyTorch)
hld67890/prml2020_pj (PyTorch)
maplightrx/maplight-tdc
Wenlin-Chen/ADKF-IFT (PyTorch)
jacquesboitreaud/DeepFRED (PyTorch)
fransou/is-meta-learning-necessary (PyTorch)
gnn4dr/DRKG (PyTorch)

Benchmarks

Benchmark | Methodology | Metric
drug-discovery-on-bace | ContextPred | AUC: 0.845
drug-discovery-on-bbbp | ContextPred | AUC: 0.687
drug-discovery-on-clintox | ContextPred | AUC: 0.726
drug-discovery-on-hiv-dataset | ContextPred | AUC: 0.799
drug-discovery-on-muv | ContextPred | AUC: 0.813
drug-discovery-on-sider | ContextPred | AUC: 0.627
drug-discovery-on-tox21 | ContextPred | AUC: 0.781
drug-discovery-on-toxcast | ContextPred | AUC: 0.657
molecular-property-prediction-on | PretrainGNN | RMSE: 0.739
molecular-property-prediction-on-bace-1 | PretrainGNN | ROC-AUC: 84.5
molecular-property-prediction-on-bbbp-1 | PretrainGNN | ROC-AUC: 68.7
molecular-property-prediction-on-clintox-1 | PretrainGNN | ROC-AUC: 72.6
molecular-property-prediction-on-freesolv | PretrainGNN | RMSE: 2.764
molecular-property-prediction-on-qm7 | PretrainGNN | MAE: 113.2
molecular-property-prediction-on-qm8 | PretrainGNN | MAE: 0.0200
molecular-property-prediction-on-qm9 | PretrainGNN | MAE: 0.00922
molecular-property-prediction-on-sider-1 | PretrainGNN | ROC-AUC: 62.7
molecular-property-prediction-on-tox21-1 | PretrainGNN | ROC-AUC: 78.1
molecular-property-prediction-on-toxcast-1 | PretrainGNN | ROC-AUC: 65.7
