
Abstract
In this paper we investigate image classification with computational resource limits at test time. Two such settings are: 1. anytime classification, where the network's prediction for a test example is progressively updated, facilitating the output of a prediction at any time; and 2. budgeted batch classification, where a fixed amount of computation is available to classify a set of examples that can be spent unevenly across "easier" and "harder" inputs. In contrast to most prior work, such as the popular Viola and Jones algorithm, our approach is based on convolutional neural networks. We train multiple classifiers with varying resource demands, which we adaptively apply during test time. To maximally re-use computation between the classifiers, we incorporate them as early-exits into a single deep convolutional neural network and inter-connect them with dense connectivity. To facilitate high quality classification early on, we use a two-dimensional multi-scale network architecture that maintains coarse and fine level features all-throughout the network. Experiments on three image-classification tasks demonstrate that our framework substantially improves the existing state-of-the-art in both settings.
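To make the early-exit idea concrete, below is a minimal PyTorch-style sketch: a classifier head is attached after each convolutional stage, and at test time stages are evaluated one at a time until some exit is confident enough. This is only an illustration of the general mechanism, not the paper's multi-scale, densely connected architecture; the `EarlyExitNet` class, its layer sizes, and the `anytime_predict` helper are hypothetical.

```python
# Sketch of early-exit ("anytime") inference, assuming PyTorch is available.
import torch
import torch.nn as nn


class EarlyExitNet(nn.Module):
    """A small CNN with a classifier head after every stage (early exits)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        channels = [3, 32, 64, 128]          # illustrative widths only
        self.stages = nn.ModuleList()
        self.exits = nn.ModuleList()
        for in_ch, out_ch in zip(channels[:-1], channels[1:]):
            # Each stage adds computation and refines the shared features.
            self.stages.append(nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
            ))
            # Each exit maps the current features to class logits.
            self.exits.append(nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
                nn.Linear(out_ch, num_classes),
            ))

    def forward(self, x):
        # Training: return logits from every exit and sum their losses.
        outputs = []
        for stage, head in zip(self.stages, self.exits):
            x = stage(x)
            outputs.append(head(x))
        return outputs


@torch.no_grad()
def anytime_predict(model: EarlyExitNet, x: torch.Tensor, threshold: float = 0.9):
    """Evaluate stages one at a time and stop at the first confident exit.

    Easy inputs exit early (cheap); hard inputs fall through to the final
    exit, which is how a fixed batch budget can be spent unevenly.
    """
    pred, conf = None, 0.0
    for stage, head in zip(model.stages, model.exits):
        x = stage(x)                                  # pay for one more stage
        probs = torch.softmax(head(x), dim=1)
        conf, pred = probs.max(dim=1)
        if conf.item() >= threshold:                  # confident enough: stop
            break
    return pred.item(), conf.item()


if __name__ == "__main__":
    model = EarlyExitNet(num_classes=10).eval()
    image = torch.randn(1, 3, 32, 32)                 # one test example
    label, confidence = anytime_predict(model, image, threshold=0.5)
    print(f"predicted class {label} with confidence {confidence:.2f}")
```

In the paper's setting the exits share computation through dense connectivity and operate on multiple feature scales, so early exits see coarse features suitable for classification; the sketch above omits both of those ingredients and only shows the exit-and-stop control flow.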
Benchmarks
| Benchmark | Method | ExpRate (%) |
|---|---|---|
| handwritten-mathematical-expression-1 | DenseWAP-MSA | 50.1 |
| handwritten-mathematical-expression-1 | DenseWAP | 47.5 |
| handwritten-mathematical-expression-2 | DenseWAP-MSA | 47.7 |
| handwritten-mathematical-expression-3 | DenseWAP | 61.85 |