A Semantic Segmentation-Based GNSS Signal Occlusion Detection and Optimization Method

  • Zhe Yue
  • Chenchen Sun
  • Xuerong Zhang
  • Chengkai Tang
  • Yuting Gao
  • Kezhao Li
Research output: Contribution to journal › Article › peer-review

Abstract

Existing research does not effectively address the increased GNSS positioning errors caused by non-line-of-sight (NLOS) reception and line-of-sight (LOS) signal attenuation due to obstructions such as buildings and trees in complex urban environments. To address this issue, we approach the problem from an environmental-perception perspective and propose a semantic segmentation-based GNSS signal occlusion detection and optimization method. The approach distinguishes between building and tree occlusions and adjusts signal weights accordingly to enhance positioning accuracy. First, a fisheye camera captures environmental imagery above the vehicle, which is processed by deep learning to segment sky, tree, and building regions. Next, satellite positions are projected onto the segmented sky image to classify signal occlusions. Then, based on the type of obstruction, a dynamic weight optimization model adjusts the contribution of each satellite to the positioning solution, thereby improving the positioning accuracy of vehicle navigation in urban environments. Finally, we build a vehicle-mounted navigation system for experimentation. The experimental results demonstrate that the proposed method improves accuracy by 16% and 10% over the existing GNSS/INS/Canny and GNSS/INS/Flood Fill methods, respectively, confirming its effectiveness in complex urban environments.
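The core pipeline step described above, projecting a satellite's azimuth/elevation onto the segmented fisheye sky image and assigning a weight by occlusion class, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the equidistant fisheye model, the class labels, and the per-class weights (`CLASS_WEIGHT`) are assumptions introduced here for clarity.

```python
import numpy as np

# Assumed class labels for the segmentation mask (illustrative only).
SKY, TREE, BUILDING = 0, 1, 2

# Hypothetical per-class weights: full trust for open-sky LOS, reduced for
# tree attenuation, strongly down-weighted for building-induced NLOS.
CLASS_WEIGHT = {SKY: 1.0, TREE: 0.5, BUILDING: 0.1}

def satellite_to_pixel(az_deg, el_deg, cx, cy, radius):
    """Project a satellite (azimuth/elevation in degrees) onto an upward
    fisheye image, assuming an equidistant model: r = radius * zenith / 90."""
    zenith = 90.0 - el_deg                  # zenith angle in degrees
    r = radius * zenith / 90.0              # radial distance from image center
    az = np.radians(az_deg)                 # azimuth clockwise from north (image 'up')
    x = cx + r * np.sin(az)
    y = cy - r * np.cos(az)
    return int(round(x)), int(round(y))

def occlusion_weight(mask, az_deg, el_deg):
    """Classify a satellite's line of sight via the segmented sky mask and
    return its weight in the positioning solution."""
    h, w = mask.shape
    cx, cy, radius = w / 2.0, h / 2.0, min(h, w) / 2.0
    x, y = satellite_to_pixel(az_deg, el_deg, cx, cy, radius)
    x = min(max(x, 0), w - 1)               # clamp to image bounds
    y = min(max(y, 0), h - 1)
    return CLASS_WEIGHT[mask[y, x]]

# Toy mask: open sky everywhere except a 'building' filling the eastern edge.
mask = np.full((200, 200), SKY, dtype=int)
mask[:, 150:] = BUILDING
print(occlusion_weight(mask, az_deg=90, el_deg=10))  # low satellite due east: building occlusion
print(occlusion_weight(mask, az_deg=0, el_deg=80))   # high satellite: open sky
```

The resulting weights would then scale each satellite's contribution in the weighted least-squares position solution, which is the role the dynamic weight optimization model plays in the method above.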

Original language: English
Article number: 2725
Journal: Remote Sensing
Volume: 17
Issue number: 15
DOIs
State: Published - Aug 2025

Keywords

  • GNSS
  • semantic segmentation
  • upward-facing camera
  • vehicle navigation
