Video synchronization with trajectory pulse

Xue Wang, Qing Wang

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-reviewed


Abstract

This paper presents a method to temporally synchronize two independently moving cameras with overlapping views. Temporal variations between image frames (such as moving objects) are powerful cues for alignment. We first generate pulse images by tracking moving objects and examining their trajectories for changes in speed. We then integrate a rank-based constraint with pulse-based matching to derive a robust approximation of spatio-temporal alignment quality for all pairs of frames. Finally, by folding both spatial and temporal cues into a single alignment framework, the nonlinear temporal mapping is found using a graph-based approach that supports partial temporal overlap between sequences. We verify the robustness and performance of the proposed approach on several challenging real video sequences. Compared to state-of-the-art techniques, our approach is robust to tracking error and can handle non-rigid scene alignment in complex dynamic scenes.
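
As a rough, hypothetical illustration of the pulse idea only (not the authors' implementation), the Python sketch below derives a binary pulse signal from a tracked trajectory by flagging abrupt speed changes, and then scores a temporal alignment of two such signals with a simple dynamic-programming pass that tolerates partial overlap. The function names (pulse_signal, align_pulses), the threshold and skip_cost parameters, and the DP formulation are assumptions for illustration; the paper's rank-based spatial constraint and graph-based mapping are not modeled here.

# Hypothetical sketch of trajectory-pulse synchronization; assumptions, not the paper's code.
import numpy as np

def pulse_signal(trajectory, threshold=2.0):
    # trajectory: (N, 2) array of tracked image coordinates, one row per frame (assumed format).
    velocity = np.diff(trajectory, axis=0)       # frame-to-frame displacement
    speed = np.linalg.norm(velocity, axis=1)     # frame-to-frame speed
    accel = np.abs(np.diff(speed))               # change in speed between consecutive frames
    return (accel > threshold).astype(float)     # binary "pulse" where speed changes abruptly

def align_pulses(pulse_a, pulse_b, skip_cost=1.0):
    # Dynamic-programming alignment score for two pulse sequences; free leading and
    # trailing skips allow the sequences to overlap only partially in time.
    n, m = len(pulse_a), len(pulse_b)
    cost = np.zeros((n + 1, m + 1))              # zero first row/column = free leading skips
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = abs(pulse_a[i - 1] - pulse_b[j - 1])
            cost[i, j] = min(cost[i - 1, j - 1] + match,   # align frame i-1 with frame j-1
                             cost[i - 1, j] + skip_cost,   # drop a frame of sequence A
                             cost[i, j - 1] + skip_cost)   # drop a frame of sequence B
    # Cheapest way to finish either sequence; remaining trailing frames are skipped for free.
    return min(cost[n, 1:].min(), cost[1:, m].min())

The free leading and trailing skips in the DP grid are what permit a partial temporal overlap between the two sequences, mirroring the partial-overlap property the abstract attributes to the graph-based formulation.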

Original language: English
Title of host publication: Intelligent Visual Surveillance - 4th Chinese Conference, IVS 2016, Proceedings
Editors: Zhang Zhang, Kaiqi Huang
Publisher: Springer Verlag
Pages: 12-19
Number of pages: 8
ISBN (Print): 9789811034756
State: Published - 2016
Event: 4th Chinese Conference on Intelligent Visual Surveillance, IVS 2016 - Beijing, China
Duration: 19 Oct 2016 - 19 Oct 2016

Publication series

Name: Communications in Computer and Information Science
Volume: 664 CCIS
ISSN (Print): 1865-0929

Conference

Conference: 4th Chinese Conference on Intelligent Visual Surveillance, IVS 2016
Country/Territory: China
City: Beijing
Period: 19/10/16 - 19/10/16
