Vision-based relative state estimation for a non-cooperative target

Qian Feng, Yong Liu, Zheng Hong Zhu, Yu Hen Hu, Quan Pan, Yang Lyu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

This paper presents an improved method for estimating the relative pose and motion of a non-cooperative target using stereoscopic vision. Two cameras mounted on the chaser acquire images of the target, and the positions and velocities of feature points in the camera frame are obtained via optical flow and photogrammetry. With the coordinates of the feature points in the target frame unknown a priori, the full dynamical states of the target relative to the chaser are coarsely estimated from the measurements in two consecutive image frames. A multiplicative extended Kalman filter is then designed to improve the estimation accuracy based on these coarse results. Numerical simulations are conducted to evaluate the convergence and precision of the proposed algorithm.
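The abstract's refinement step uses a multiplicative extended Kalman filter (MEKF), which represents attitude as a unit quaternion and filters a small-angle error state rather than the quaternion itself. The sketch below illustrates one predict/update cycle of that generic technique; it is not the authors' implementation, and the function names, scalar-first quaternion convention, and identity measurement model (a direct attitude measurement) are illustrative assumptions.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two scalar-first quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_from_rotvec(v):
    """Quaternion for a rotation vector v (axis * angle), scalar first."""
    angle = np.linalg.norm(v)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = v / angle
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def mekf_step(q, P, omega, dt, q_meas, Q, R):
    """One predict/update cycle of a multiplicative EKF on attitude.

    q      : scalar-first unit quaternion (body -> reference)
    P      : 3x3 covariance of the small-angle error state
    omega  : body angular rate (rad/s), assumed constant over dt
    q_meas : measured attitude quaternion (hypothetical direct measurement)
    Q, R   : 3x3 process and measurement noise covariances
    """
    # Predict: rotate by omega*dt (exact for constant omega), inflate covariance.
    q = quat_mul(q, quat_from_rotvec(omega * dt))
    q /= np.linalg.norm(q)
    P = P + Q
    # Update: small-angle rotation-vector residual between measurement and prediction.
    q_inv = q * np.array([1.0, -1.0, -1.0, -1.0])
    dq = quat_mul(q_meas, q_inv)
    delta = 2.0 * np.sign(dq[0]) * dq[1:]       # approx. error rotation vector
    K = P @ np.linalg.inv(P + R)                # measurement Jacobian H = I here
    # Multiplicative correction: fold the error estimate back into the quaternion.
    q = quat_mul(quat_from_rotvec(K @ delta), q)
    q /= np.linalg.norm(q)
    P = (np.eye(3) - K) @ P
    return q, P
```

The multiplicative structure (correcting by quaternion multiplication instead of vector addition) keeps the attitude estimate on the unit sphere, which is the usual reason an MEKF is preferred over an additive EKF for attitude states.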

Original language: English
Title of host publication: AIAA Guidance, Navigation, and Control
Publisher: American Institute of Aeronautics and Astronautics Inc, AIAA
ISBN (Print): 9781624105265
DOIs
State: Published - 1 Jan 2018
Event: AIAA Guidance, Navigation, and Control Conference, 2018 - Kissimmee, United States
Duration: 8 Jan 2018 - 12 Jan 2018

Publication series

Name: AIAA Guidance, Navigation, and Control Conference, 2018

Conference

Conference: AIAA Guidance, Navigation, and Control Conference, 2018
Country/Territory: United States
City: Kissimmee
Period: 8/01/18 - 12/01/18
