Axel Berg, Magnus Oskarsson, Mark O'Connor

Abstract
Regression via classification (RvC) is a common method used for regression problems in deep learning, where the target variable belongs to a set of continuous values. By discretizing the target into a set of non-overlapping classes, it has been shown that training a classifier can improve neural network accuracy compared to using a standard regression approach. However, it is not clear how the set of discrete classes should be chosen and how it affects the overall solution. In this work, we propose that using several discrete data representations simultaneously can improve neural network learning compared to a single representation. Our approach is end-to-end differentiable and can be added as a simple extension to conventional learning methods, such as deep neural networks. We test our method on three challenging tasks and show that our method reduces the prediction error compared to a baseline RvC approach while maintaining a similar model complexity.
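As a rough illustration of the idea (a sketch, not the authors' implementation), the snippet below attaches several classification heads to a shared feature vector, each head using a different discretization of the target range, and decodes each head with a softmax expected value before averaging the estimates. The class name `LabelDiversityHead`, the bin edges, and the L1 training loss are assumptions made for the example.

```python
# Minimal sketch of regression via classification (RvC) with several
# discretizations of the target ("label diversity"). All names and bin
# choices here are illustrative assumptions, not the paper's code.

import torch
import torch.nn as nn


class LabelDiversityHead(nn.Module):
    """Predicts a continuous target via several classification heads,
    each defined over a different discretization of the target range."""

    def __init__(self, in_features: int, bin_centers: list[torch.Tensor]):
        super().__init__()
        # One linear classifier per discretization.
        self.heads = nn.ModuleList(
            nn.Linear(in_features, len(c)) for c in bin_centers
        )
        # Register bin centers as buffers so they follow the module's device.
        for i, c in enumerate(bin_centers):
            self.register_buffer(f"centers_{i}", c)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Each head produces a softmax over its own bins; the expected value
        # of the bin centers gives a differentiable continuous estimate.
        estimates = []
        for i, head in enumerate(self.heads):
            probs = head(features).softmax(dim=-1)           # (B, num_bins_i)
            centers = getattr(self, f"centers_{i}")          # (num_bins_i,)
            estimates.append((probs * centers).sum(dim=-1))  # (B,)
        # Average the per-discretization estimates into one prediction.
        return torch.stack(estimates, dim=-1).mean(dim=-1)


# Example: age estimation in [0, 100] with two different binnings.
backbone_dim = 512
bin_centers = [
    torch.linspace(0.5, 99.5, 100),  # 100 uniform bins of width 1
    torch.linspace(2.5, 97.5, 20),   # 20 coarser bins of width 5
]
head = LabelDiversityHead(backbone_dim, bin_centers)

features = torch.randn(8, backbone_dim)  # stand-in for a backbone's output
prediction = head(features)              # (8,) continuous predictions
loss = nn.functional.l1_loss(prediction, torch.rand(8) * 100)
loss.backward()                          # the whole pipeline is differentiable
```

Because every operation (softmax, expected value, averaging) is differentiable, the discretized heads can be trained end to end alongside a conventional backbone, which matches the abstract's claim that the extension plugs into standard deep learning pipelines.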
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| age-estimation-on-utkface | Randomized Bins | MAE: 4.55 |
| head-pose-estimation-on-biwi | Direct Regression | MAE (trained with BIWI data): 2.54 |
| historical-color-image-dating-on-hci | Label Diversity | MAE: 0.67 |