Batch Transformer: Look for Attention in Batch

Myung Beom Her, Jisu Jeong, Hojoon Song, Ji-Hyeong Han
Abstract

Facial expression recognition (FER) has received considerable attention in computer vision, particularly for "in-the-wild" settings such as human-computer interaction. However, FER images contain uncertainties such as occlusion, low resolution, pose variation, illumination variation, and subjectivity, including expressions that do not match the target label. Consequently, a single noisy image yields little trustworthy information, which can significantly degrade FER performance. To address this issue, we propose a batch transformer (BT), built on the proposed class batch attention (CBA) module, which prevents overfitting to noisy data and extracts trustworthy information by training on features aggregated from several images in a batch rather than from a single image. We also propose multi-level attention (MLA), which prevents overfitting to specific features by capturing correlations between feature levels. In this paper, we present a batch transformer network (BTN) that combines these components. Experimental results on various FER benchmark datasets show that the proposed BTN consistently outperforms the state-of-the-art. These representative results demonstrate the promise of the proposed BTN for FER.
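To illustrate the core batch-attention idea described above (each image's representation is refined by attending over the features of all images in the batch, rather than relying on a single noisy image), here is a minimal NumPy sketch. The function name, shapes, and scaled dot-product formulation are illustrative assumptions, not the paper's actual CBA implementation.

```python
import numpy as np

def batch_attention(features):
    """Toy scaled dot-product attention across the batch dimension.

    features: (B, D) array -- one feature vector per image in the batch.
    Each output row is a softmax-weighted mix of all B batch features,
    so information from a noisy single image is smoothed by the rest
    of the batch. This is a sketch, not the paper's CBA module.
    """
    B, D = features.shape
    scores = features @ features.T / np.sqrt(D)    # (B, B) pairwise similarities
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over the batch axis
    return weights @ features                      # (B, D) batch-refined features

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))   # e.g. a batch of 8 images with 16-dim features
refined = batch_attention(x)   # same shape as the input, (8, 16)
```

The key design point is that the attention axis is the batch, not the spatial or channel axis: gradients for one image depend on the other images sampled alongside it, which is what lets the network discount label noise in any single example.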

Benchmarks

Benchmark: facial-expression-recognition-on-affectnet
Methodology: BTN
Metrics: Accuracy (7 emotion): 67.60; Accuracy (8 emotion): 64.29

Benchmark: facial-expression-recognition-on-raf-db
Methodology: BTN
Metrics: Avg. Accuracy: 87.3; Overall Accuracy: 92.54
