Facial expression transfer method based on frequency analysis

Wei Wei, Chunna Tian, Stephen John Maybank, Yanning Zhang

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

We propose a novel expression transfer method based on frequency analysis of multi-expression facial images. We locate the facial features automatically and describe the shape deformations between a neutral expression and non-neutral expressions. The subtle expression changes are important visual clues for distinguishing different expressions, and they are more salient in the frequency domain than in the image domain. We extract the subtle local expression deformations of the source subject, coded in the wavelet decomposition, and transfer this expression information to a target subject. The resulting synthesized image preserves both the facial appearance of the target subject and the expression details of the source subject. The method is extended to dynamic expression transfer to allow a more precise interpretation of facial expressions. Experiments on the Japanese Female Facial Expression (JAFFE), the extended Cohn-Kanade (CK+) and the PIE facial expression databases show the superiority of our method over state-of-the-art methods.
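The core idea described in the abstract, separating an image into a low-frequency appearance component and high-frequency detail bands via a wavelet decomposition, then grafting the source's detail bands onto the target, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it assumes a single-level 2D Haar transform and a naive whole-band swap, whereas the paper operates on localized expression deformations; all function names here are hypothetical.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar transform: approximation + (h, v, d) detail bands."""
    p00, p01 = img[0::2, 0::2], img[0::2, 1::2]
    p10, p11 = img[1::2, 0::2], img[1::2, 1::2]
    a = (p00 + p01 + p10 + p11) / 4.0  # low-frequency appearance
    h = (p00 + p01 - p10 - p11) / 4.0  # horizontal detail
    v = (p00 - p01 + p10 - p11) / 4.0  # vertical detail
    d = (p00 - p01 - p10 + p11) / 4.0  # diagonal detail
    return a, (h, v, d)

def haar_idwt2(a, details):
    """Exact inverse of haar_dwt2."""
    h, v, d = details
    rows, cols = a.shape
    img = np.empty((2 * rows, 2 * cols))
    img[0::2, 0::2] = a + h + v + d
    img[0::2, 1::2] = a + h - v - d
    img[1::2, 0::2] = a - h + v - d
    img[1::2, 1::2] = a - h - v + d
    return img

def transfer_details(source, target):
    """Keep the target's low-frequency appearance; graft in the
    source's high-frequency bands (a crude stand-in for transferring
    subtle expression deformations coded in the wavelet domain)."""
    _, src_details = haar_dwt2(source)
    tgt_approx, _ = haar_dwt2(target)
    return haar_idwt2(tgt_approx, src_details)
```

Because the Haar transform is linear and exactly invertible, decomposing the synthesized image recovers the target's approximation band and the source's detail bands unchanged, which is the property the transfer relies on.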

Original language: English
Pages (from-to): 115-128
Number of pages: 14
Journal: Pattern Recognition
Volume: 49
DOIs
State: Published - 1 Jan 2016

Keywords

  • Expression transfer
  • Facial feature location
  • Frequency domain analysis
  • Warping technique
