Lighting alignment for image sequences

Xiaoyue Jiang, Xiaoyi Feng, Jun Wu, Jinye Peng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Lighting is one of the main challenges in image processing. Although some algorithms have been proposed to deal with lighting variation, most of them are designed for a single image rather than for image sequences. In fact, the correlation between frames can provide useful information for removing illumination diversity, information that is not available from a single image. In this paper, we propose a two-step lighting alignment algorithm for image sequences. First, a perception-based lighting model is initialized, using entropy, according to the lighting condition of the first frame. Then the difference between frames is used to optimize the parameters of the lighting model, so that the lighting conditions can be aligned across the sequence. At the same time, the local features of each frame are enhanced. Experimental results show the effectiveness of the proposed algorithm.
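The abstract only outlines the two steps. As a rough illustration (not the authors' actual model), the sketch below assumes a simple gamma-style tone curve standing in for the perception-based lighting model, grayscale frames scaled to [0, 1], and finite-difference gradient descent for the frame-difference refinement; all function names here are invented for the example.

```python
import numpy as np

def entropy(img, bins=64):
    # Shannon entropy of the intensity histogram (intensities in [0, 1]).
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def init_gamma(frame, gammas=np.linspace(0.2, 3.0, 57)):
    # Step 1 (sketch): pick the tone-curve exponent that maximizes the
    # entropy of the first frame, standing in for the entropy-based
    # initialization of the perception-based model.
    scores = [entropy(frame ** g) for g in gammas]
    return float(gammas[int(np.argmax(scores))])

def align_sequence(frames, lr=0.1, iters=100, eps=1e-3):
    # Step 2 (sketch): for each later frame, refine its exponent so the
    # corrected frame matches the previous corrected frame, by descending
    # the mean squared frame difference (backtracking on the step size).
    g = init_gamma(frames[0])
    aligned, params = [frames[0] ** g], [g]
    for f in frames[1:]:
        ref = aligned[-1]
        cost = lambda x: float(np.mean((f ** x - ref) ** 2))
        g_cur, step = params[-1], lr
        for _ in range(iters):
            grad = (cost(g_cur + eps) - cost(g_cur)) / eps  # finite diff
            cand = float(np.clip(g_cur - step * grad, 0.05, 5.0))
            if cost(cand) < cost(g_cur):
                g_cur = cand          # accept only improving steps
            else:
                step *= 0.5           # backtrack on overshoot
        aligned.append(f ** g_cur)
        params.append(g_cur)
    return aligned, params
```

For a frame that is simply a darker version of its predecessor (e.g. the same scene raised to a power above 1), the refinement drives the new frame's exponent down until the corrected frames agree, which is the intuition behind using inter-frame differences to align lighting.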

Original language: English
Title of host publication: Image and Graphics - 8th International Conference, ICIG 2015, Proceedings
Editors: Yu-Jin Zhang
Publisher: Springer Verlag
Pages: 462-474
Number of pages: 13
ISBN (Print): 9783319219622
DOIs
State: Published - 2015
Event: 8th International Conference on Image and Graphics, ICIG 2015 - Tianjin, China
Duration: 13 Aug 2015 – 16 Aug 2015

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9218
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 8th International Conference on Image and Graphics, ICIG 2015
Country/Territory: China
City: Tianjin
Period: 13/08/15 – 16/08/15

Keywords

  • Entropy-based model
  • Lighting alignment
  • Perception-based model
  • Tracking