Blind quality assessment for screen content images by combining local and global features

Jun Wu, Zhaoqiang Xia, Huiqing Zhang, Huifang Li

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)

Abstract

Recently, several no-reference image quality assessment (NR-IQA) metrics have been developed for the quality evaluation of screen content images (SCIs). However, most of them are opinion-aware methods, which are limited by the subjective opinion scores of the training data. Hence, in this paper, we propose a novel opinion-unaware method to predict the quality of SCIs without any prior information. Firstly, a union feature is proposed by considering the local and global visual characteristics of the human visual system simultaneously. Specifically, a local structural feature is extracted from the rough and smooth regions of SCIs by leveraging a sparse representation model. As a supplement, a global feature is obtained by combining the luminance statistical feature and the local binary pattern (LBP) feature of entire SCIs. Secondly, to remove the dependence on subjective opinion scores, a new large-scale training dataset containing 80,000 distorted SCIs is constructed, and the quality labels of these distorted SCIs are derived from an advanced full-reference IQA metric. Thirdly, a regression model between image features and image quality labels is learned from the training dataset by employing a learning-based framework. The quality scores of test SCIs can then be predicted by the pre-trained regression model. Experimental results on the two largest SCI-oriented databases show that the proposed method is superior to state-of-the-art NR-IQA metrics.
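For illustration, the sketch below shows one way the global-feature branch and the regression step described in the abstract could be assembled in Python. It is not the authors' implementation: the specific luminance statistics (mean, standard deviation), LBP settings (P=8, R=1, uniform), the SVR regressor, and the synthetic training data are all assumptions introduced here.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVR

def global_feature(gray):
    """Global descriptor (sketch): luminance statistics plus an LBP histogram
    over the whole image. Exact statistics and LBP settings are assumptions."""
    lum_stats = np.array([gray.mean(), gray.std()])                 # luminance statistics
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")    # assumed P=8, R=1
    hist, _ = np.histogram(lbp, bins=np.arange(11), density=True)   # 10 uniform-LBP bins
    return np.concatenate([lum_stats, hist])

# Opinion-unaware training step (sketch): features of distorted SCIs are
# regressed onto pseudo-labels produced by a full-reference IQA metric,
# so no subjective opinion scores are required. Synthetic data stands in
# for the real SCIs and FR-IQA labels here.
rng = np.random.default_rng(0)
X = np.stack([global_feature((rng.random((64, 64)) * 255).astype(np.uint8))
              for _ in range(20)])
y = rng.random(20)                    # stand-in for FR-IQA quality labels
model = SVR(kernel="rbf").fit(X, y)   # learned regression model
print(model.predict(X[:3]))           # predicted quality scores for test SCIs
```

In the full method, this global descriptor would be concatenated with the sparse-representation-based local structural feature before regression; the SVR here is only one plausible choice of learning-based framework.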

Original language: English
Pages (from-to): 31-40
Number of pages: 10
Journal: Digital Signal Processing: A Review Journal
Volume: 91
DOI
Publication status: Published - Aug 2019
