Semantic Self-adaptation: Enhancing Generalization with a Single Sample

Sherwin Bahmani Oliver Hahn Eduard Zamfir Nikita Araslanov Daniel Cremers Stefan Roth


Abstract

The lack of out-of-domain generalization is a critical weakness of deep networks for semantic segmentation. Previous studies relied on the assumption of a static model, i.e., once the training process is complete, model parameters remain fixed at test time. In this work, we challenge this premise with a self-adaptive approach for semantic segmentation that adjusts the inference process to each input sample. Self-adaptation operates on two levels. First, it fine-tunes the parameters of convolutional layers to the input image using consistency regularization. Second, in Batch Normalization layers, self-adaptation interpolates between the training distribution and the reference distribution derived from a single test sample. Despite both techniques being well known in the literature, their combination sets new state-of-the-art accuracy on synthetic-to-real generalization benchmarks. Our empirical study suggests that self-adaptation may complement the established practice of model regularization at training time for improving deep network generalization to out-of-domain data. Our code and pre-trained models are available at https://github.com/visinf/self-adaptive.
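The second adaptation level described in the abstract can be illustrated with a small sketch: interpolating Batch Normalization statistics between the stored training distribution and the statistics of a single test sample. This is a minimal NumPy illustration, not the paper's implementation; the function name `adaptive_bn` and the interpolation weight `alpha` are assumptions chosen for clarity.

```python
import numpy as np

def adaptive_bn(x, running_mean, running_var, alpha=0.5, eps=1e-5):
    """Normalize activations with interpolated BN statistics.

    x            : (N, C) activations from a single test sample
                   (spatial positions flattened into N).
    running_mean : (C,) mean accumulated during training.
    running_var  : (C,) variance accumulated during training.
    alpha        : weight on the training statistics (hypothetical name);
                   alpha=1 recovers standard inference-time BN,
                   alpha=0 uses only the test-sample statistics.
    """
    # Statistics of the single test sample (the "reference distribution").
    sample_mean = x.mean(axis=0)
    sample_var = x.var(axis=0)

    # Interpolate between training and test-sample statistics.
    mean = alpha * running_mean + (1.0 - alpha) * sample_mean
    var = alpha * running_var + (1.0 - alpha) * sample_var

    # Standard BN normalization with the blended statistics.
    return (x - mean) / np.sqrt(var + eps)
```

At the extremes, `alpha=1.0` behaves like a frozen (static) model, while `alpha=0.0` fully adapts the normalization to the test sample; intermediate values trade off between the two distributions.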

Code Repositories

visinf/self-adaptive (official, PyTorch)

Benchmarks

Benchmark                                    Methodology                    mIoU
domain-generalization-on-gta-to-avg          Self-adaptation (ResNet-101)   44.89
domain-generalization-on-gta-to-avg          Self-adaptation (ResNet-50)    44.07
domain-generalization-on-gta5-to-cityscapes  Self-adaptation (ResNet-101)   46.99

