Region-based parallax-tolerant image stitching

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

Image stitching with large parallax has always been a challenging task, and accurate image alignment is critical to the stitching result. In this paper, an image stitching method based on superpixel segmentation regions is proposed. To address the shortage of matched feature points under large parallax, an improved multi-plane RANSAC method is used to make the feature-match selection more robust. For image alignment, a mesh optimization method with a global similarity prior is adopted, and a superpixel-based segmentation method is used to obtain reasonable matching points and the global similarity transformation parameters. A standard seam-cutting algorithm finally composites the images together. Experiments show that the proposed method effectively improves stitching performance in complex scenes with large parallax.
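
The pipeline described in the abstract can be illustrated with a short sketch. The Python/OpenCV snippet below is a minimal, hedged approximation of two of its stages: a sequential ("multi-plane") RANSAC that greedily groups matches into planar motion groups, and SLIC superpixel segmentation of the reference image. The mesh optimization with the global similarity prior and the seam-cutting compositing are not shown, and the function names, thresholds, and file names here are illustrative assumptions rather than the authors' implementation.

```python
# A minimal sketch (not the paper's exact method) of multi-plane RANSAC
# plus SLIC superpixels. Requires opencv-python and, for the superpixel
# step, opencv-contrib-python. All thresholds are assumed values.
import cv2
import numpy as np

def multi_plane_ransac(pts1, pts2, max_planes=4, thresh=3.0, min_inliers=12):
    """Greedily extract planar motion groups from point matches:
    fit a homography by RANSAC, remove its inliers, and repeat."""
    planes = []
    remaining = np.arange(len(pts1))
    for _ in range(max_planes):
        if len(remaining) < min_inliers:
            break
        H, mask = cv2.findHomography(pts1[remaining], pts2[remaining],
                                     cv2.RANSAC, thresh)
        if H is None:
            break
        inlier_mask = mask.ravel() == 1
        inliers = remaining[inlier_mask]
        if len(inliers) < min_inliers:
            break
        planes.append((H, inliers))
        remaining = remaining[~inlier_mask]  # refit on the leftover matches
    return planes

img1 = cv2.imread("left.jpg")   # hypothetical input pair
img2 = cv2.imread("right.jpg")

# Detect and match SIFT features with Lowe's ratio test.
sift = cv2.SIFT_create()
k1, d1 = sift.detectAndCompute(img1, None)
k2, d2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
pts1 = np.float32([k1[m.queryIdx].pt for m in good])
pts2 = np.float32([k2[m.trainIdx].pt for m in good])

planes = multi_plane_ransac(pts1, pts2)

# SLIC superpixels on the reference image; the per-pixel region labels can
# then be used to assign each match (and mesh cell) to the plane that
# dominates its region, in the spirit of the region-based alignment.
slic = cv2.ximgproc.createSuperpixelSLIC(img1, region_size=40)
slic.iterate(10)
labels = slic.getLabels()
```

In this sketch the sequential refitting is what makes the estimator "multi-plane": each pass explains one roughly planar subset of the scene, which is why it tolerates large parallax better than a single global homography.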

Original language: English
Title of host publication: Tenth International Conference on Graphics and Image Processing, ICGIP 2018
Editors: Chunming Li, Hui Yu, Zhigeng Pan, Yifei Pu
Publisher: SPIE
ISBN (Electronic): 9781510628281
DOIs
State: Published - 2019
Externally published: Yes
Event: 10th International Conference on Graphics and Image Processing, ICGIP 2018 - Chengdu, China
Duration: 12 Dec 2018 – 14 Dec 2018

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 11069
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: 10th International Conference on Graphics and Image Processing, ICGIP 2018
Country/Territory: China
City: Chengdu
Period: 12/12/18 – 14/12/18

Keywords

  • Global similarity transformation
  • Image alignment
  • Image stitching
  • Large parallax
  • Seam-cutting
  • Superpixel
