Lens distortion correction method based on linear approximation

Min Qi, Hongjuan Xin, Ke Li, Yangyu Fan, Yong Dong, Zhichao Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

To address the effect of lens distortion on imaging measurement systems, this paper proposes an improved lens distortion correction method based on linear approximation. First, the radial distortion coefficient is solved linearly using the principle of cross-ratio invariance under perspective projection. A higher-precision straight line is then obtained by orthogonal distance fitting. Finally, a comprehensive distortion index function is constructed, and the distortion coefficients are refined with the Gauss-Newton method. Simulation results show that the average correction error of the method is less than 0.2 pixel with 800 control points and a noise standard deviation of 0.1. Being easy to implement, robust, and accurate, the proposed method is suitable for high-precision correction in machine vision.

Original language: English
Title of host publication: 2015 International Conference on Computer and Computational Sciences, ICCCS 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 75-78
Number of pages: 4
ISBN (Electronic): 9781479918195
DOIs
State: Published - 18 Dec 2015
Event: International Conference on Computer and Computational Sciences, ICCCS 2015 - Greater Noida, India
Duration: 27 Jan 2015 - 29 Jan 2015

Publication series

Name: 2015 International Conference on Computer and Computational Sciences, ICCCS 2015

Conference

Conference: International Conference on Computer and Computational Sciences, ICCCS 2015
Country/Territory: India
City: Greater Noida
Period: 27/01/15 - 29/01/15

Keywords

  • cross-ratio invariance
  • Gauss-Newton
  • index function of distortion
  • orthogonal distance

