MiCE: Mixture of Contrastive Experts for Unsupervised Image Clustering

Tsung Wei Tsai Chongxuan Li Jun Zhu

Abstract

We present Mixture of Contrastive Experts (MiCE), a unified probabilistic clustering framework that simultaneously exploits the discriminative representations learned by contrastive learning and the semantic structures captured by a latent mixture model. Motivated by the mixture-of-experts formulation, MiCE employs a gating function to partition an unlabeled dataset into subsets according to their latent semantics, and multiple experts that each discriminate the subset of instances assigned to them via contrastive learning. To solve the nontrivial inference and learning problems caused by the latent variables, we further develop a scalable variant of the Expectation-Maximization (EM) algorithm for MiCE and provide a proof of convergence. Empirically, we evaluate the clustering performance of MiCE on four widely adopted natural image datasets. MiCE achieves significantly better results than various previous methods and a strong contrastive learning baseline.
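To make the EM structure concrete, the following is a minimal NumPy sketch of one plausible reading of the abstract: a softmax gating function assigns each embedding a posterior responsibility over experts, and an M-step updates per-expert prototypes from those responsibilities. The prototype parameterization, cosine-similarity gating logits, and the temperature `tau` are illustrative assumptions, not the paper's exact formulation (which combines the gating with per-expert contrastive objectives).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def e_step(z, mu, tau=0.5):
    """E-step sketch: posterior responsibility of each expert (cluster) for
    each embedding, with gating logits taken as temperature-scaled cosine
    similarity to per-expert prototypes (an assumed stand-in for MiCE's gating)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)    # (N, D) unit embeddings
    mu = mu / np.linalg.norm(mu, axis=1, keepdims=True) # (K, D) unit prototypes
    logits = z @ mu.T / tau                             # (N, K) gating logits
    return softmax(logits, axis=1)                      # rows sum to 1

def m_step(z, resp):
    """M-step sketch: responsibility-weighted mean of embeddings,
    renormalized to the unit sphere, as the new prototypes."""
    mu = resp.T @ z                                     # (K, D)
    return mu / np.linalg.norm(mu, axis=1, keepdims=True)

# Toy run: 8 random 4-d embeddings, 3 experts, 5 EM iterations.
rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))
mu = rng.normal(size=(3, 4))
for _ in range(5):
    resp = e_step(z, mu)
    mu = m_step(z, resp)
print(resp.shape)  # (8, 3)
```

In the actual method the per-expert objective is contrastive (instance discrimination within each expert's soft subset) rather than a plain prototype update; this sketch only shows how latent soft assignments and parameter updates alternate.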

Code Repositories

TsungWeiTsai/MiCE (official, PyTorch)

Benchmarks

Image clustering on ImageNet-Dog-15 (method: MiCE)
  Accuracy: 0.439 | NMI: 0.423 | ARI: 0.286 | Image Size: 96

Image clustering on STL-10 (method: MiCE)
  Accuracy: 0.752 | NMI: 0.635 | Backbone: ResNet-34 | Train Split: Train+Test
