
Memory-and-Anticipation Transformer for Online Action Understanding

Jiahao Wang, Guo Chen, Yifei Huang, Limin Wang, Tong Lu

Abstract

Most existing forecasting systems are memory-based methods, which attempt to mimic human forecasting ability by employing various memory mechanisms, and have progressed in temporal modeling for memory dependency. Nevertheless, an obvious weakness of this paradigm is that it can only model limited historical dependence and cannot transcend the past. In this paper, we rethink the temporal dependence of event evolution and propose a novel memory-anticipation-based paradigm to model an entire temporal structure, including the past, present, and future. Based on this idea, we present Memory-and-Anticipation Transformer (MAT), a memory-anticipation-based approach, to address the online action detection and anticipation tasks. In addition, owing to the inherent superiority of MAT, it can process online action detection and anticipation tasks in a unified manner. The proposed MAT model is tested on four challenging benchmarks, TVSeries, THUMOS'14, HDD, and EPIC-Kitchens-100, for online action detection and anticipation tasks, and it significantly outperforms all existing methods. Code is available at https://github.com/Echo0125/Memory-and-Anticipation-Transformer.
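To make the memory-anticipation idea concrete, here is a toy sketch of the data flow described above: a fixed-size memory bank of past frame features, a set of "anticipation" queries that attend over memory to estimate future context, and a unified readout over past, present, and anticipated future. All class, weight, and buffer names are hypothetical, the weights are random, and this is not the authors' MAT implementation; it only illustrates the paradigm under simplified assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MemoryAnticipationSketch:
    """Toy sketch of a memory-and-anticipation pipeline (not the authors' code).

    - A circular memory bank stores features of past frames.
    - Anticipation queries attend over memory to predict future context.
    - The present frame is classified from [present, memory summary, futures],
      so detection and anticipation share one readout (the "unified manner").
    Weights are random: this shows data flow only, not trained behavior.
    """

    def __init__(self, feat_dim=16, mem_len=8, n_future=2, n_classes=5, seed=0):
        rng = np.random.default_rng(seed)
        self.mem = np.zeros((mem_len, feat_dim))               # circular memory bank
        self.future_q = rng.normal(size=(n_future, feat_dim))  # anticipation queries
        self.W_cls = rng.normal(size=(feat_dim * (2 + n_future), n_classes))
        self.ptr = 0

    def step(self, frame_feat):
        # 1) write the present frame's feature into circular memory
        self.mem[self.ptr % len(self.mem)] = frame_feat
        self.ptr += 1
        # 2) anticipation: queries attend over memory (scaled dot-product)
        attn = softmax(self.future_q @ self.mem.T / np.sqrt(self.mem.shape[1]))
        future = attn @ self.mem                               # (n_future, feat_dim)
        # 3) unified readout over present, memory summary, and anticipated futures
        feats = np.concatenate([frame_feat, self.mem.mean(0), future.ravel()])
        return softmax(feats @ self.W_cls)                     # class probabilities

model = MemoryAnticipationSketch()
for t in range(10):  # simulate a streaming video, one feature vector per frame
    probs = model.step(np.random.default_rng(t).normal(size=16))
print(probs.shape)   # (5,) — one probability per action class
```

In a real model the random projections would be replaced by trained transformer blocks, and the anticipated future tokens would feed back to refine the memory reading; the point of the sketch is only that past, present, and future are modeled jointly rather than memory alone.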

Benchmarks

Benchmark                        Methodology         Metric
action-detection-on-thumos-14    MAT (Ours)          mAP: 58.2
action-detection-on-thumos-14    MAT (Ours) Trans    mAP: 71.6
