QP_TR trust region blob tracking through scale-space with automatic selection of features

Jingping Jia, Qing Wang, Yanmei Chai, Rongchun Zhao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A new approach to tracking objects in image sequences is proposed, in which continuous changes in the size and orientation of the target can be precisely described. For each incoming frame, a likelihood image of the target is created from the automatically selected best feature, so that the target region appears as a blob. The scale of this blob can be determined from the local maxima of differential scale-space filters. We employ the QP_TR trust region algorithm to search for the local maxima of the orientational multi-scale normalized Laplacian filter of the likelihood image, locating the target and determining its scale and orientation. Tracking results on example sequences show that the proposed method describes the target more accurately and thus achieves better tracking precision.
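To make the scale-selection idea in the abstract concrete, below is a minimal sketch of Lindeberg-style blob scale selection on a likelihood image using the scale-normalized Laplacian. It is only an illustration of that one ingredient: the function name `select_blob_scale`, the scale range, and the use of SciPy are assumptions for this example, and the paper's oriented filter and QP_TR trust-region search are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def select_blob_scale(likelihood, sigmas=np.linspace(2.0, 16.0, 15)):
    """Return the sigma whose scale-normalized Laplacian response is
    strongest at the likelihood image's peak, plus that peak location.
    Illustrative sketch only; not the paper's full tracking algorithm."""
    # Take the brightest pixel of the likelihood image as the blob center.
    y, x = np.unravel_index(np.argmax(likelihood), likelihood.shape)
    best_sigma, best_response = sigmas[0], -np.inf
    for sigma in sigmas:
        # sigma^2 * LoG gives the scale-normalized response; the sign is
        # flipped so that bright blobs produce positive values.
        response = -(sigma ** 2) * gaussian_laplace(likelihood, sigma)
        if response[y, x] > best_response:
            best_sigma, best_response = sigma, response[y, x]
    return best_sigma, (y, x)
```

In the paper the maximum over position, scale, and orientation is found with the QP_TR trust region algorithm rather than by exhaustive search over a fixed scale grid as above.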

Original language: English
Title of host publication: Image Analysis and Recognition - Third International Conference, ICIAR 2006, Proceedings
Publisher: Springer Verlag
Pages: 862-873
Number of pages: 12
ISBN (Print): 3540448918, 9783540448914
DOIs
State: Published - 2006
Event: 3rd International Conference on Image Analysis and Recognition, ICIAR 2006 - Povoa de Varzim, Portugal
Duration: 18 Sep 2006 - 20 Sep 2006

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4141 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd International Conference on Image Analysis and Recognition, ICIAR 2006
Country/Territory: Portugal
City: Povoa de Varzim
Period: 18/09/06 - 20/09/06
