Robust target feature extraction based on modified cochlear filter analysis model

Yaozhen Wu, Yixin Yang, Feng Tian, Long Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The performance of underwater target recognition systems depends on the consistency and adaptability of target features under complex conditions between the training and testing stages. In this paper, we investigate target feature extraction in a complex underwater environment and propose a novel approach based on a modified cochlear filter analysis model of the human auditory system. The frequency responses and distribution of the modified cochlear filter bank are similar to those of the basilar membrane in the cochlea. Fast forward and inverse transforms of the modified cochlear filter bank are also presented for discrete-time signals to reduce the computational load. Experimental results on real ship-radiated noise data verify that the modified cochlear filter analysis model is robust to background noise.
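
The abstract gives only a high-level description of the method; the keywords name the gammachirp filter as its auditory building block. As a rough illustration of that idea, the sketch below builds an ERB-rate-spaced gammachirp filter bank and extracts log sub-band energies as features. The chirp parameter c, the bank size, and the cochlear_features helper are illustrative assumptions, not the paper's exact modified model.

```python
import numpy as np

def erb(fc):
    """Equivalent rectangular bandwidth (Glasberg & Moore, 1990) in Hz."""
    return 24.7 + 0.108 * fc

def gammachirp(fc, fs, c=-1.0, n=4, b=1.019, dur=0.05):
    """Impulse response of a gammachirp filter centered at fc (Hz).

    c is the chirp term; c = 0 reduces this to a plain gammatone.
    """
    t = np.arange(1, int(dur * fs)) / fs        # t > 0 so log(t) is defined
    env = t ** (n - 1) * np.exp(-2.0 * np.pi * b * erb(fc) * t)
    h = env * np.cos(2.0 * np.pi * fc * t + c * np.log(t))
    return h / np.sqrt(np.sum(h ** 2))          # unit-energy normalization

def cochlear_features(x, fs, n_bands=32, f_lo=100.0):
    """Log sub-band energies from an ERB-rate-spaced gammachirp bank.

    ERB-rate spacing of center frequencies mimics the basilar
    membrane's frequency map, as described in the abstract.
    """
    f_hi = 0.9 * fs / 2.0
    e_lo, e_hi = (21.4 * np.log10(4.37e-3 * f + 1.0) for f in (f_lo, f_hi))
    fcs = (10.0 ** (np.linspace(e_lo, e_hi, n_bands) / 21.4) - 1.0) / 4.37e-3
    return np.array([
        np.log(np.mean(np.convolve(x, gammachirp(fc, fs), mode="same") ** 2) + 1e-12)
        for fc in fcs
    ])

# Stand-in for ship-radiated noise: 1 s of white noise at 8 kHz.
fs = 8000
x = np.random.randn(fs)
print(cochlear_features(x, fs).shape)           # -> (32,)
```

Note that this brute-force per-band convolution ignores the paper's main computational contribution: the fast forward and inverse transforms of the filter bank, whose details are not given in the abstract.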

Original language: English
Title of host publication: 2013 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2013
State: Published - 2013
Event: 2013 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2013 - Kunming, Yunnan, China
Duration: 5 Aug 2013 - 8 Aug 2013

Publication series

Name: 2013 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2013

Conference

Conference: 2013 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2013
Country/Territory: China
City: Kunming, Yunnan
Period: 5/08/13 - 8/08/13

Keywords

  • Auditory perception
  • Feature extraction
  • Gammachirp filter
  • Modified cochlear filter analysis
  • Target recognition
