Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning
Mehdi Sajjadi; Mehran Javanmardi; Tolga Tasdizen

Abstract
Effective convolutional neural networks are trained on large sets of labeled data. However, creating large labeled datasets is a very costly and time-consuming task. Semi-supervised learning uses unlabeled data to train a model with higher accuracy when there is a limited set of labeled data available. In this paper, we consider the problem of semi-supervised learning with convolutional neural networks. Techniques such as randomized data augmentation, dropout and random max-pooling provide better generalization and stability for classifiers that are trained using gradient descent. Multiple passes of an individual sample through the network might lead to different predictions due to the non-deterministic behavior of these techniques. We propose an unsupervised loss function that takes advantage of the stochastic nature of these methods and minimizes the difference between the predictions of multiple passes of a training sample through the network. We evaluate the proposed method on several benchmark datasets.
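The unsupervised loss described above can be sketched as follows. This is a minimal NumPy illustration under the assumption that we already have the softmax outputs from several stochastic passes of one sample; the network itself and the paper's complementary mutual-exclusivity term are omitted:

```python
import numpy as np

def stability_loss(predictions):
    """Transformation/stability loss: the sum of squared differences
    between every pair of stochastic predictions for the same input.

    predictions: array of shape (n_passes, n_classes) holding the
    softmax outputs of n passes of one sample through the network.
    The passes differ because of dropout, random max-pooling, and
    randomized data augmentation.
    """
    p = np.asarray(predictions, dtype=float)
    n = p.shape[0]
    loss = 0.0
    # Accumulate the squared L2 distance over all unordered pairs.
    for i in range(n - 1):
        for j in range(i + 1, n):
            loss += np.sum((p[i] - p[j]) ** 2)
    return loss

# Identical predictions incur zero loss; disagreement is penalized,
# so minimizing this term pushes the passes toward consistency.
same = stability_loss([[0.7, 0.3], [0.7, 0.3]])   # 0.0
diff = stability_loss([[0.9, 0.1], [0.6, 0.4]])   # 0.18
```

Because the loss needs no labels, it can be evaluated on unlabeled data and added to the usual supervised cross-entropy term during training.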
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| semi-supervised-image-classification-on-cifar-2 | Π-Model | Percentage error: 39.19 |
| semi-supervised-image-classification-on-svhn-1 | Π-Model | Accuracy: 82.35 |