Supervised Contrastive Learning

Prannay Khosla, Piotr Teterwak, Chen Wang, Aaron Sarna, Yonglong Tian, Phillip Isola, Aaron Maschinot, Ce Liu, Dilip Krishnan

Abstract

Contrastive learning applied to self-supervised representation learning has seen a resurgence in recent years, leading to state-of-the-art performance in the unsupervised training of deep image models. Modern batch contrastive approaches subsume or significantly outperform traditional contrastive losses such as triplet, max-margin, and the N-pairs loss. In this work, we extend the self-supervised batch contrastive approach to the fully supervised setting, allowing us to effectively leverage label information. Clusters of points belonging to the same class are pulled together in embedding space, while clusters of samples from different classes are simultaneously pushed apart. We analyze two possible versions of the supervised contrastive (SupCon) loss, identifying the best-performing formulation of the loss. On ResNet-200, we achieve top-1 accuracy of 81.4% on the ImageNet dataset, which is 0.8% above the best number reported for this architecture. We show consistent outperformance over cross-entropy on other datasets and two ResNet variants. The loss shows benefits for robustness to natural corruptions and is more stable to hyperparameter settings such as optimizers and data augmentations. Our loss function is simple to implement, and reference TensorFlow code is released at https://t.ly/supcon.
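To make the loss concrete, the following is a minimal PyTorch sketch of the batch-wise supervised contrastive loss in the "summation outside the log" (L_out) form that the paper identifies as the better-performing variant. The function and variable names (supcon_loss, temperature) are illustrative, not taken from the paper; the official reference implementation is the TensorFlow code linked in the abstract, and the HobbitLong/SupContrast repository listed below provides a PyTorch version.

import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.1):
    # features: (N, D) embeddings for one batch of views; labels: (N,) int class labels.
    # Sketch of the SupCon L_out loss: for each anchor, average the log-probability
    # of its positives (same label) against all other samples in the batch.
    features = F.normalize(features, dim=1)          # project onto the unit hypersphere
    n = features.shape[0]
    device = features.device

    # Pairwise cosine similarities scaled by temperature.
    logits = features @ features.T / temperature
    # Subtract per-row max for numerical stability (does not change the loss).
    logits = logits - logits.max(dim=1, keepdim=True).values.detach()

    # Positives: same label as the anchor, excluding the anchor itself.
    labels = labels.view(-1, 1)
    self_mask = torch.eye(n, device=device)
    pos_mask = torch.eq(labels, labels.T).float().to(device) * (1 - self_mask)

    # Denominator sums over all samples except the anchor itself.
    exp_logits = torch.exp(logits) * (1 - self_mask)
    log_prob = logits - torch.log(exp_logits.sum(dim=1, keepdim=True))

    # Mean log-probability over positives, averaged across anchors.
    # clamp(min=1) guards against anchors with no positives in the batch.
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    mean_log_prob_pos = (pos_mask * log_prob).sum(dim=1) / pos_count
    return -mean_log_prob_pos.mean()

# Example usage on random data: batch of 8 embeddings, 128-dim, 3 classes.
feats = torch.randn(8, 128)
labels = torch.randint(0, 3, (8,))
loss = supcon_loss(feats, labels)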

Code Repositories

hannaiiyanggit/unicon (pytorch)
alexk1704/scclv2 (tf)
delyan-boychev/grad-cache-con-learning (pytorch)
XG293/SupConLoss (pytorch)
HobbitLong/SupContrast (official, pytorch)
renato145/ContrastiveLoss (pytorch)
guerbet-ai/wsp-contrastive (pytorch)
davidczy/supcon_gamma (pytorch)
forcesh/SupContrast (pytorch)
ilyassmoummad/ProtoCLR (pytorch)
caesarea38/doclangid (pytorch)
hooman650/supcl-seq (pytorch)
vk1996/contrastive_learning (tf)
salarim/Semi-Supervised-CL (pytorch)
ZIZUN/CPFT (pytorch)
sidtandon2014/fw-shapley (pytorch)
flyingsheepbin/pet-biometrics (pytorch)
Liut2016/ecg-supcontrast (pytorch)
PaperCodeReview/SupCL-TF (tf)
uiuctml/HypStructure (pytorch)

Benchmarks

Benchmark | Methodology | Metrics
class-incremental-learning-on-cifar100 | SCR | 10-stage average accuracy: 65.98
image-classification-on-imagenet | ResNet-200 (Supervised Contrastive) | Top 1 Accuracy: 80.8%
