S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning

Karsten Roth, Timo Milbich, Björn Ommer, Joseph Paul Cohen, Marzyeh Ghassemi

Abstract

Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot applications by learning generalizing embedding spaces, although recent work in DML has shown strong performance saturation across training objectives. Generalization capacity, however, is known to scale with the embedding space dimensionality; unfortunately, high-dimensional embeddings also raise the retrieval cost of downstream applications. To remedy this, we propose Simultaneous Similarity-based Self-Distillation (S2SD). S2SD extends DML with knowledge distillation from auxiliary, high-dimensional embedding and feature spaces, leveraging complementary context during training while leaving the test-time cost unchanged and adding only negligible training time. Experiments and ablations across different objectives and standard benchmarks show that S2SD offers notable improvements of up to 7% in Recall@1 while also setting a new state-of-the-art. Code available at https://github.com/MLforHealth/S2SD.
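
As a rough illustration of the idea in the abstract, the PyTorch sketch below pairs a low-dimensional target embedding head with high-dimensional auxiliary heads and distills the batch similarity structure of the auxiliary heads into the target head. The head dimensions, temperature T, weight gamma, and the generic dml_loss callable are illustrative assumptions, not the paper's exact configuration; see the official repository for the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class S2SDHeads(nn.Module):
    """Target embedding head plus auxiliary high-dimensional heads (training only)."""
    def __init__(self, feat_dim=2048, target_dim=128, aux_dims=(512, 1024, 2048)):
        super().__init__()
        self.target = nn.Linear(feat_dim, target_dim)  # the only head kept at test time
        self.aux = nn.ModuleList([nn.Linear(feat_dim, d) for d in aux_dims])

    def forward(self, feats):
        z = F.normalize(self.target(feats), dim=1)
        zs_aux = [F.normalize(head(feats), dim=1) for head in self.aux]
        return z, zs_aux

def similarity_kl(z_student, z_teacher, T=1.0):
    # KL divergence between row-wise softmaxed batch similarity matrices;
    # the high-dimensional teacher similarities are detached so knowledge
    # flows only into the low-dimensional target head.
    s = (z_student @ z_student.t()) / T
    t = (z_teacher @ z_teacher.t()).detach() / T
    return F.kl_div(F.log_softmax(s, dim=1), F.softmax(t, dim=1), reduction="batchmean")

def s2sd_loss(z, zs_aux, labels, dml_loss, gamma=1.0):
    # DML objective applied to every head, plus similarity distillation
    # from each auxiliary head into the target head.
    loss = dml_loss(z, labels) + sum(dml_loss(za, labels) for za in zs_aux)
    return loss + gamma * sum(similarity_kl(z, za) for za in zs_aux)
```

Because only the target head is kept at inference, retrieval cost matches that of a plain low-dimensional model, which is the trade-off the abstract describes.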

Code Repositories

MLforHealth/S2SD (official, PyTorch)

Benchmarks

Benchmark (metric learning)      Methodology        Metric
CARS196                          ResNet50 + S2SD    R@1: 89.5
CUB-200-2011                     ResNet50 + S2SD    R@1: 70.1
Stanford Online Products         ResNet50 + S2SD    R@1: 81.0
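
R@1 (Recall@1) above is the fraction of test queries whose single nearest neighbor in the embedding space belongs to the same class as the query. A minimal sketch of how it is typically computed, assuming L2-normalized embeddings `embs` of shape (N, d) and integer class `labels` of shape (N,); the function name is ours, not from the repository:

```python
import torch

def recall_at_1(embs, labels):
    # Cosine similarity between all pairs of L2-normalized embeddings.
    sims = embs @ embs.t()
    sims.fill_diagonal_(-float("inf"))   # exclude each query itself
    nn_idx = sims.argmax(dim=1)          # index of the nearest neighbor per query
    return (labels[nn_idx] == labels).float().mean().item()
```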
