No-Reference Video Quality Assessment Using Space-Time Chips

Joshua P. Ebenezer Zaixi Shang Yongjun Wu Hai Wei Alan C. Bovik

Abstract

We propose a new prototype model for no-reference video quality assessment (VQA) based on the natural statistics of space-time chips of videos. Space-time chips (ST-chips) are a new, quality-aware feature space which we define as space-time localized cuts of video data in directions that are determined by the local motion flow. We use parametrized distribution fits to the bandpass histograms of space-time chips to characterize quality, and show that the parameters from these models are affected by distortion and can hence be used to objectively predict the quality of videos. Our prototype method, which we call ChipQA-0, is agnostic to the types of distortion affecting the video, and is based on identifying and quantifying deviations from the expected statistics of natural, undistorted ST-chips in order to predict video quality. We train and test our resulting model on several large VQA databases and show that our model achieves high correlation against human judgments of video quality and is competitive with state-of-the-art models.
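
The abstract describes quality-aware features obtained from parametrized distribution fits to the bandpass histograms of ST-chips. A common choice for such fits in natural-statistics-based quality models is the generalized Gaussian distribution (GGD), whose shape and scale parameters shift in the presence of distortion. The sketch below is a minimal illustration of a GGD fit by moment matching under that assumption; it is not the paper's implementation, the function name fit_ggd is hypothetical, and the motion-driven ST-chip extraction itself is not reproduced.

```python
# Hypothetical sketch: estimate generalized Gaussian distribution (GGD)
# parameters from a set of bandpass coefficients by moment matching.
# The GGD choice and all parameter ranges here are illustrative assumptions.
import numpy as np
from scipy.special import gamma


def fit_ggd(coeffs: np.ndarray) -> tuple[float, float]:
    """Estimate GGD shape (alpha) and scale (sigma) via moment matching."""
    coeffs = coeffs.ravel()
    sigma_sq = np.mean(coeffs ** 2)            # second moment (assumes zero mean)
    e_abs = np.mean(np.abs(coeffs))            # first absolute moment
    rho = sigma_sq / (e_abs ** 2 + 1e-12)      # ratio matched against the GGD

    # Search candidate shape parameters for the closest match to rho.
    alphas = np.arange(0.2, 10.0, 0.001)
    r_alpha = gamma(1.0 / alphas) * gamma(3.0 / alphas) / gamma(2.0 / alphas) ** 2
    alpha = alphas[np.argmin((r_alpha - rho) ** 2)]
    return float(alpha), float(np.sqrt(sigma_sq))


if __name__ == "__main__":
    # Laplacian samples are a special case of the GGD with shape near 1.
    rng = np.random.default_rng(0)
    samples = rng.laplace(size=100_000)
    alpha, sigma = fit_ggd(samples)
    print(f"shape ~ {alpha:.2f}, scale ~ {sigma:.2f}")
```

In models of this kind, the fitted shape and scale values (computed per chip or per region) are concatenated into a feature vector and mapped to a quality score with a learned regressor.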

Benchmarks

Benchmark                                     Methodology    Metrics
video-quality-assessment-on-live-etri         ChipQA-0       SRCC: 0.4028
video-quality-assessment-on-live-livestream   ChipQA-0       SRCC: 0.7513
