A Unified Approach of Multi-scale Deep and Hand-crafted Features for Defocus Estimation
Jinsun Park, Yu-Wing Tai, Donghyeon Cho, In So Kweon

Abstract
In this paper, we introduce robust and synergetic hand-crafted features and a simple but efficient deep feature from a convolutional neural network (CNN) architecture for defocus estimation. This paper systematically analyzes the effectiveness of different features, and shows how each feature can compensate for the weaknesses of other features when they are concatenated. For a full defocus map estimation, we extract image patches on strong edges sparsely, after which we use them for deep and hand-crafted feature extraction. In order to reduce the degree of patch-scale dependency, we also propose a multi-scale patch extraction strategy. A sparse defocus map is generated using a neural network classifier followed by a probability-joint bilateral filter. The final defocus map is obtained from the sparse defocus map with guidance from an edge-preserving filtered input image. Experimental results show that our algorithm is superior to state-of-the-art algorithms in terms of defocus estimation. Our work can be used for applications such as segmentation, blur magnification, all-in-focus image generation, and 3-D estimation.
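To make the pipeline concrete, the sketch below illustrates the sparse, multi-scale patch extraction on strong edges described in the abstract. It is not the authors' implementation: the edge detector (OpenCV's Canny), the patch sizes, the edge thresholds, and the sampling stride are all hypothetical choices made for illustration, and the feature/classification stages are only indicated in comments.

```python
# Illustrative sketch (not the paper's official code) of sparse multi-scale
# patch extraction on strong edges. Assumes OpenCV and NumPy are available;
# patch sizes, Canny thresholds and the sampling stride are hypothetical.
import cv2
import numpy as np


def extract_multiscale_patches(image_bgr, patch_sizes=(15, 23, 31), stride=8):
    """Sample patches of several sizes centred on strong-edge pixels."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)              # strong-edge mask
    ys, xs = np.nonzero(edges)

    half_max = max(patch_sizes) // 2
    patches = {s: [] for s in patch_sizes}
    centers = []
    # Sparse sampling: keep only every `stride`-th edge pixel.
    for y, x in zip(ys[::stride], xs[::stride]):
        if (y < half_max or x < half_max or
                y >= gray.shape[0] - half_max or
                x >= gray.shape[1] - half_max):
            continue                               # skip patches crossing the border
        for s in patch_sizes:                      # multi-scale patches per center
            h = s // 2
            patches[s].append(gray[y - h:y + h + 1, x - h:x + h + 1])
        centers.append((y, x))

    stacked = {s: np.stack(p) for s, p in patches.items() if p}
    return centers, stacked

# Each patch would then be described by concatenated hand-crafted and CNN
# features and passed to a classifier to produce the sparse defocus map,
# which is subsequently filtered and propagated to a full defocus map.
```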
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| defocus-estimation-on-cuhk-blur-detection | DHDE | Blur Segmentation Accuracy: 83.73 |