On the Importance of Gradients for Detecting Distributional Shifts in the Wild

Rui Huang Andrew Geng Yixuan Li


Abstract

Detecting out-of-distribution (OOD) data has become a critical component in ensuring the safe deployment of machine learning models in the real world. Existing OOD detection approaches primarily rely on the output or feature space for deriving OOD scores, while largely overlooking information from the gradient space. In this paper, we present GradNorm, a simple and effective approach for detecting OOD inputs by utilizing information extracted from the gradient space. GradNorm directly employs the vector norm of gradients, backpropagated from the KL divergence between the softmax output and a uniform probability distribution. Our key idea is that the magnitude of gradients is higher for in-distribution (ID) data than for OOD data, making it informative for OOD detection. GradNorm demonstrates superior performance, reducing the average FPR95 by up to 16.33% compared to the previous best method.
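To make the scoring rule concrete, below is a minimal sketch of a GradNorm-style score as described in the abstract. It is not the authors' implementation: it assumes a standard torchvision classifier whose final linear layer is `model.fc`, and uses the L1 norm of the gradient of the KL divergence between the softmax output and a uniform distribution, taken with respect to that layer's weights. Higher scores are expected for in-distribution inputs.

```python
# Hedged sketch of a GradNorm-style OOD score (not the official code).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


def gradnorm_score(model: nn.Module, x: torch.Tensor) -> float:
    """Return a GradNorm-style OOD score for one input batch.

    Higher scores are expected for in-distribution data.
    Assumes the classifier's last linear layer is `model.fc`.
    """
    model.zero_grad()
    logits = model(x)
    num_classes = logits.shape[-1]
    # Uniform target distribution over the classes.
    targets = torch.ones_like(logits) / num_classes
    # Cross-entropy against uniform targets equals KL(uniform || softmax)
    # up to an additive constant, so their gradients coincide.
    loss = torch.sum(-targets * F.log_softmax(logits, dim=-1))
    loss.backward()
    # L1 norm of the gradient w.r.t. the final layer's weights.
    return model.fc.weight.grad.abs().sum().item()


# Illustrative usage: score a random image-sized tensor with ResNet-50.
model = models.resnet50(weights=None).eval()
x = torch.randn(1, 3, 224, 224)
print(gradnorm_score(model, x))
```

In practice one would compute this score for each test input and threshold it (e.g., at the value giving 95% true positive rate on ID data) to separate ID from OOD samples.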

Code Repositories

deeplearning-wisc/gradnorm_ood (Official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
out-of-distribution-detection-on-imagenet-1k-10 | GradNorm (ResNetv2-101) | FPR95: 61.42
out-of-distribution-detection-on-imagenet-1k-3 | GradNorm | FPR95: 50.03
out-of-distribution-detection-on-imagenet-1k-8 | GradNorm (ResNetv2-101) | FPR95: 46.48
out-of-distribution-detection-on-imagenet-1k-9 | GradNorm (ResNetv2-101) | FPR95: 60.86
