Going the Extra Mile in Face Image Quality Assessment: A Novel Database and Model

Shaolin Su, Hanhe Lin, Vlad Hosu, Oliver Wiedemann, Jinqiu Sun, Yu Zhu, Hantao Liu, Yanning Zhang, Dietmar Saupe

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

An accurate computational model for image quality assessment (IQA) benefits many vision applications, such as image filtering, image processing, and image generation. Although the study of face images is an important subfield of computer vision research, the lack of face IQA data and models limits the precision of current IQA metrics on face image processing tasks such as face super-resolution, face enhancement, and face editing. To narrow this gap, in this article, we first introduce the largest annotated face IQA database developed to date, which contains 20,000 human faces - an order of magnitude more than all existing rated datasets of faces - of diverse individuals in highly varied circumstances. Based on this database, we further propose a novel deep learning model to accurately predict face image quality, which, for the first time, explores the use of generative priors for IQA. By taking advantage of the rich statistics encoded in well-pretrained, off-the-shelf generative models, we obtain generative prior information and use it as latent references to facilitate blind IQA. The experimental results demonstrate both the value of the proposed dataset for face IQA and the superior performance of the proposed model.
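The abstract describes using features from a pretrained generative model as a latent reference for blind (no-reference) quality prediction. The following is a minimal NumPy sketch of that general idea only, not the authors' model: the "generative prior" and image encoders are hypothetical random-projection stand-ins for the pretrained networks the paper uses, and the distance-to-score mapping is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: in the paper, a pretrained face generative model
# supplies the prior; here fixed random projections play both encoder roles.
W_prior = rng.normal(size=(64, 256))  # frozen "generative prior" encoder
W_img = rng.normal(size=(64, 256))    # image feature encoder

def encode(x, W):
    """Project a flattened face image into a latent feature vector."""
    return np.tanh(W @ x)

def quality_score(image):
    """Blind IQA sketch: compare the image's features with a latent
    reference derived from the generative prior, then map the feature
    discrepancy to a score in (0, 1] (higher = better predicted quality)."""
    f_img = encode(image, W_img)
    f_ref = encode(image, W_prior)        # latent reference from the prior
    dist = np.linalg.norm(f_img - f_ref)  # feature-space discrepancy
    return float(1.0 / (1.0 + dist))

image = rng.normal(size=256)  # toy flattened 16x16 "face" image
score = quality_score(image)
```

The key design point the sketch illustrates is that the reference is not a pristine copy of the input (which blind IQA lacks) but a prior-derived surrogate computed from the degraded image itself.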

Original language: English
Pages (from-to): 2671-2685
Number of pages: 15
Journal: IEEE Transactions on Multimedia
Volume: 26
DOIs
State: Published - 2024

Keywords

  • face quality
  • GAN
  • generative priors
  • image quality assessment
  • subjective study
