
# SPIE-2023-Supplementary

## UGC Quality Assessment: Exploring the Impact of Saliency in Deep Feature-Based Quality Assessment

Authors: Xinyi Wang, Angeliki Katsenou, and David Bull

Visual Information Laboratory, University of Bristol, BS1 5DD, Bristol, UK

Contact: xinyi.wang@bristol.ac.uk

Conference paper homepage: Applications of Digital Image Processing XLVI

arXiv paper: arXiv:2308.06853

## Abstract

The volume of User-Generated Content (UGC) has increased significantly in recent years, and assessing the quality of this content remains a challenge. So far, state-of-the-art metrics have not exhibited a very high correlation with perceptual quality. In this paper, we explore state-of-the-art metrics that extract or combine natural scene statistics and deep neural network features, and we experiment with introducing saliency maps to improve perceptibility. We train and test our models on public datasets, namely YouTube-UGC and KoNViD-1k. Preliminary results indicate that high correlations are achieved using deep features alone, while adding saliency does not always boost performance. Our results and code are made publicly available to serve as a benchmark for the research community and can be found on our project page: https://github.com/xinyiW915/SPIE-2023-Supplementary.

The RAPIQUE_VSFA_Saliency code is available at https://github.com/xinyiW915/RVS-resize.
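
To illustrate the saliency-weighted deep-feature idea described in the abstract, the following is a minimal PyTorch sketch that pools a CNN feature map with a precomputed saliency map. The ResNet-50 backbone, tensor shapes, and pooling scheme are assumptions made for illustration only and are not the exact pipeline used in the paper.

```python
# Minimal sketch (PyTorch): saliency-weighted pooling of deep CNN features for one
# video frame. The backbone, shapes, and normalisation below are illustrative
# assumptions, not the paper's exact configuration.
import torch
import torch.nn.functional as F
from torchvision.models import resnet50

frame = torch.rand(1, 3, 224, 224)      # placeholder normalised RGB frame
saliency = torch.rand(1, 1, 224, 224)   # placeholder per-pixel saliency map in [0, 1]

backbone = resnet50(weights=None)       # pretrained weights would normally be loaded
# Keep everything up to the last convolutional stage (drop avgpool and fc).
feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-2]).eval()

with torch.no_grad():
    fmap = feature_extractor(frame)     # (1, 2048, 7, 7) spatial feature map
    # Resize the saliency map to the feature-map resolution and normalise it to sum to 1.
    sal = F.interpolate(saliency, size=fmap.shape[-2:], mode="bilinear", align_corners=False)
    weights = sal / (sal.sum(dim=(-2, -1), keepdim=True) + 1e-8)
    # Saliency-weighted average pooling -> one 2048-D descriptor per frame.
    pooled = (fmap * weights).sum(dim=(-2, -1))

print(pooled.shape)  # torch.Size([1, 2048])
```

In a full quality-assessment pipeline, such frame-level descriptors would typically be aggregated over time and regressed against subjective scores, with performance reported via correlation measures such as SROCC and PLCC (e.g. scipy.stats.spearmanr and pearsonr).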