Attention-Based Multi-Context Guiding for Few-Shot Semantic Segmentation

Gang Yu, Pengwan Yang, Tao Hu, Chiliang Zhang, Yadong Mu, Cees G. M. Snoek

Abstract

Few-shot learning is a nascent research topic, motivated by the fact that traditional deep learning methods require tremendous amounts of data. The scarcity of annotated data becomes even more challenging in semantic segmentation, since pixel-level annotation is more labor-intensive to acquire. To tackle this issue, we propose an Attention-based Multi-Context Guiding (A-MCG) network, which consists of three branches: the support branch, the query branch, and the feature fusion branch. A key differentiator of A-MCG is the integration of multi-scale context features between the support and query branches, enforcing better guidance from the support set. In addition, we adopt spatial attention along the fusion branch to highlight context information from several scales, enhancing self-supervision in one-shot learning. To address the fusion problem in multi-shot learning, a Conv-LSTM is adopted to collaboratively integrate the sequential support features and elevate the final accuracy. Our architecture obtains state-of-the-art results on unseen classes in a variant of the PASCAL VOC12 dataset and performs favorably against previous work, with gains of 1.1% and 1.4% mIoU in the 1-shot and 5-shot settings, respectively.
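The spatial attention described for the fusion branch can be illustrated with a minimal sketch: score each spatial location, normalize the scores with a softmax over the spatial grid, and reweight the feature map. The scoring rule below (mean activation across channels) and all names are illustrative assumptions, not the authors' implementation, which learns the attention from data.

```python
import numpy as np

def spatial_attention(features):
    """Reweight a feature map by softmax attention over spatial locations.

    features: array of shape (C, H, W), e.g. one fusion-branch scale.
    Returns the attended feature map with the same shape.
    """
    c, h, w = features.shape
    # Illustrative scoring: mean activation across channels per location.
    scores = features.mean(axis=0).reshape(-1)      # (H*W,)
    scores = scores - scores.max()                  # numerical stability
    attn = np.exp(scores) / np.exp(scores).sum()    # softmax over H*W
    attn = attn.reshape(1, h, w)                    # broadcast over channels
    return features * attn

# Toy multi-scale fusion: attend to each scale, then combine.
# (Both scales share the same spatial size here; real scales would be resized.)
f1 = np.random.rand(4, 8, 8)
f2 = np.random.rand(4, 8, 8)
fused = spatial_attention(f1) + spatial_attention(f2)
```

Because the attention map sums to one over the spatial grid, it acts as a soft selector of informative locations rather than a uniform rescaling.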

Benchmarks

Benchmark: few-shot-semantic-segmentation-on-pascal5i-1
Methodology: A-MCG-Conv-LSTM
Metric: meanIOU 62.2
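The meanIOU metric reported above averages, over classes, the ratio of correctly predicted pixels to the union of predicted and ground-truth pixels for each class. A minimal sketch of the computation (not the benchmark's exact evaluation script):

```python
def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes.

    pred, target: flat lists of per-pixel class labels.
    Classes absent from both prediction and ground truth are skipped.
    """
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union > 0:
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Two classes: background (0) and foreground (1).
pred   = [0, 0, 1, 1]
target = [0, 1, 1, 1]
print(mean_iou(pred, target, 2))  # (1/2 + 2/3) / 2 = 7/12 ≈ 0.583
```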
