Visual odometry and scene matching integrated navigation system in UAV

Chunhui Zhao, Rongzhi Wang, Tianwu Zhang, Quan Pan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Based on real-time airborne image sequences, a novel vision-based integrated navigation method for UAVs that combines relative and absolute navigation algorithms is proposed. Visual odometry serves as the relative algorithm, estimating the position of the UAV from the homography matrix derived between two consecutive images, with the RANSAC algorithm used to remove mismatched points. To eliminate the severe error accumulation of visual odometry and to improve positioning accuracy and reliability, an absolute method based on scene matching with the FREAK descriptor is introduced to build an integrated navigation system. The method is tested on data collected with semi-physical simulation software. The results indicate that the proposed method achieves relatively high positioning accuracy and can serve as an effective component of a UAV navigation system.
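
The abstract names two components: a relative step that estimates frame-to-frame motion from a RANSAC-filtered homography, and an absolute step that matches the current frame against a reference scene using FREAK descriptors. The sketch below is not the authors' code; it is a minimal illustration of those two steps assuming OpenCV with the contrib modules (cv2.xfeatures2d). The FAST detector, the reuse of FREAK for the odometry step, and the RANSAC threshold are illustrative assumptions.

```python
import cv2
import numpy as np

detector = cv2.FastFeatureDetector_create()            # keypoint detector (assumed choice)
freak = cv2.xfeatures2d.FREAK_create()                  # FREAK binary descriptor (as in the paper's scene matching)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def describe(gray):
    """Detect keypoints and compute FREAK descriptors on a grayscale image."""
    kps = detector.detect(gray, None)
    kps, desc = freak.compute(gray, kps)
    return kps, desc

def matched_points(kps_a, desc_a, kps_b, desc_b):
    """Return corresponding point arrays from brute-force Hamming matching."""
    matches = matcher.match(desc_a, desc_b)
    src = np.float32([kps_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kps_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    return src, dst

def relative_motion(prev_gray, curr_gray):
    """Visual-odometry step: homography between consecutive frames,
    with RANSAC rejecting mismatched point pairs."""
    src, dst = matched_points(*describe(prev_gray), *describe(curr_gray))
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, inlier_mask

def absolute_fix(frame_gray, reference_gray):
    """Scene-matching step: register the current frame against a georeferenced
    reference image to bound the drift accumulated by visual odometry."""
    src, dst = matched_points(*describe(frame_gray), *describe(reference_gray))
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H
```

In an integrated scheme of the kind described, the relative homographies would propagate the position estimate between frames, and the absolute fixes from scene matching would periodically correct it; how the two are fused is specified in the paper itself, not in this sketch.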

Original language: English
Title of host publication: FUSION 2014 - 17th International Conference on Information Fusion
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9788490123553
State: Published - 3 Oct 2014
Event: 17th International Conference on Information Fusion, FUSION 2014 - Salamanca, Spain
Duration: 7 Jul 2014 - 10 Jul 2014

Publication series

Name: FUSION 2014 - 17th International Conference on Information Fusion

Conference

Conference: 17th International Conference on Information Fusion, FUSION 2014
Country/Territory: Spain
City: Salamanca
Period: 7/07/14 - 10/07/14

Keywords

  • Integrated Navigation
  • Scene Matching
  • UAV
  • Visual Odometry
