Muti-stage learning for gender and age prediction

  • Jie Fang
  • Yuan Yuan
  • Xiaoqiang Lu
  • Yachuang Feng

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

Automatic gender and age prediction has become relevant to an increasing number of applications, particularly with the rise of social platforms and social media. However, the performance of existing methods on real-world images is still not satisfactory, especially when compared with that of face recognition. The reason is that facial images for gender and age prediction have inherently small inter-class and large intra-class differences; for example, two images with different skin colors but the same age category label exhibit a large intra-class difference. However, most existing methods have not constructed representations discriminative enough to capture these inherent characteristics. In this paper, a method based on muti-stage learning is proposed. The first stage marks the object regions with an encoder-decoder based segmentation network. Specifically, the segmentation network classifies each pixel into one of two classes, "people" and others, and only the "people" regions are used for subsequent processing. The second stage predicts the gender and age information with the proposed prediction network, which encodes global information, local region information and the interactions among different local regions into the final representation, and then finalizes the prediction. Additionally, we evaluate our method on three public and challenging datasets, and the experimental results verify the effectiveness of the proposed method.
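The two-stage pipeline described above can be sketched in plain NumPy. This is a minimal illustration, not the authors' networks: `segment_people` is a hypothetical stand-in for the encoder-decoder segmentation network (here a simple intensity threshold plays the role of per-pixel "people"-vs-other scores), and `predict_representation` mimics how the prediction network might fuse a global descriptor, per-region descriptors, and pairwise region interactions into one final representation. All function names, the number of regions, and the interaction operator (element-wise product) are assumptions for illustration only.

```python
import numpy as np

def segment_people(image, threshold=0.5):
    """Stage 1 (hypothetical stand-in for the segmentation network):
    classify each pixel as 'people' (1) or other (0).
    A real encoder-decoder would output these per-pixel scores; here we
    simply threshold the mean channel intensity to get a binary mask."""
    scores = image.mean(axis=-1)                     # H x W pseudo-scores
    return (scores > threshold).astype(np.float32)   # 1 = "people" pixel

def predict_representation(image, mask, num_regions=4):
    """Stage 2 (hypothetical sketch of the prediction network's fusion):
    only 'people' pixels are kept, then a global descriptor, local region
    descriptors, and pairwise region interactions are concatenated."""
    masked = image * mask[..., None]                 # keep "people" regions only
    global_feat = masked.mean(axis=(0, 1))           # global information
    h = image.shape[0] // num_regions                # horizontal strips as "regions"
    local_feats = [masked[i * h:(i + 1) * h].mean(axis=(0, 1))
                   for i in range(num_regions)]      # local region information
    interactions = [local_feats[i] * local_feats[j]  # pairwise interactions
                    for i in range(num_regions)
                    for j in range(i + 1, num_regions)]
    return np.concatenate([global_feat, *local_feats, *interactions])

rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))        # dummy RGB image
mask = segment_people(img)           # stage 1: "people" mask
rep = predict_representation(img, mask)  # stage 2: fused representation
print(rep.shape)                     # 3 global + 4*3 local + 6*3 interaction dims
```

A gender or age classifier would then operate on `rep`; the point of the sketch is only the data flow, i.e. segmentation mask first, then a representation fusing global, local, and interaction terms.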

Original language: English
Pages (from-to): 114-124
Number of pages: 11
Journal: Neurocomputing
Volume: 334
DOIs
State: Published - 21 Mar 2019

Keywords

  • Gender and age prediction
  • Muti-stage learning
  • Segmentation network
