Rethinking Softmax with Cross-Entropy: Neural Network Classifier as Mutual Information Estimator
Zhenyue Qin Dongwoo Kim Tom Gedeon

Abstract
Mutual information is widely applied to learn latent representations of observations, whilst its role in classification neural networks remains to be better explained. We show that optimising the parameters of classification neural networks with softmax cross-entropy is equivalent to maximising the mutual information between inputs and labels under the balanced-data assumption. Through experiments on synthetic and real datasets, we show that softmax cross-entropy can estimate mutual information approximately. When applied to image classification, this relation helps approximate the point-wise mutual information between an input image and a label without modifying the network structure. To this end, we propose infoCAM, an informative class activation map, which highlights the regions of an input image that are most relevant to a given label based on differences in information. The activation map helps localise the target object in an input image. Through experiments on the semi-supervised object localisation task with two real-world datasets, we evaluate the effectiveness of our information-theoretic approach.
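The core relation can be sketched numerically: for a balanced dataset with K classes, H(Y) = log K, and the softmax cross-entropy approximates H(Y|X), so I(X;Y) ≈ log K − CE. Below is a minimal NumPy illustration of this estimator (the function name and array shapes are illustrative, not from the paper's released code):

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mi_estimate_from_ce(logits, labels):
    """Estimate I(X;Y) in nats from classifier logits on a balanced batch.

    Under the balanced-data assumption, H(Y) = log K and the mean
    cross-entropy approximates H(Y|X), so I(X;Y) ~ log K - CE.
    """
    probs = softmax(logits)
    ce = -np.mean(np.log(probs[np.arange(len(labels)), labels]))
    num_classes = logits.shape[1]
    return np.log(num_classes) - ce
```

As a sanity check, a classifier with uniform (uninformative) logits yields an estimate of 0, while a near-perfect classifier yields an estimate close to log K, the maximum possible mutual information for K balanced classes.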
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| fine-grained-image-classification-on | PC-Softmax | Accuracy: 89.73; Average Per-Class Accuracy: 87.69 |
| image-classification-on-imbalanced-cub-200 | PC-Softmax | Accuracy: 89.73; Average Per-Class Accuracy: 87.69 |
| weakly-supervised-object-localization-on-cub | InfoCAM | Top-1 Error Rate: 54.17; Top-1 Localization Accuracy: 55.83 |
| weakly-supervised-object-localization-on-tiny | InfoCAM | Top-1 Localization Accuracy: 43.34 |