Real-time Depth Estimation for Aerial Panoramas in Virtual Reality

Di Xu, Xiaojun Liu, Yanning Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

With the emergence of consumer-level 360° panoramic cameras, omnidirectional RGB images and videos are now easy to capture, either hand-held or from a drone. Although previous works achieve plausible results on room-sized panoramic datasets, they are limited to indoor scenes with little rotation. In this paper, we present a real-time depth estimation method for more challenging aerial panoramas, where the viewing angle changes rapidly and the lighting conditions are more complicated. Our graph convolutional network (GCN)-based framework makes full use of the global connectivity of omnidirectional images and is trained on extensive outdoor data. Experiments show that our method accurately and robustly estimates the depth of outdoor aerial panoramas captured from various angles.
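
The paper itself provides no code here; the following is only a minimal sketch of the general idea the abstract describes: treating panorama pixels as graph nodes whose horizontal neighbours wrap around the 360° seam, then applying a standard graph convolution. The helper names (equirect_adjacency, GCNLayer), the toy grid size, and the use of a plain Kipf-Welling-style GCN layer are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch: wrap-around pixel graph over an equirectangular
    # panorama plus one GCN layer. Not the authors' code.
    import torch
    import torch.nn as nn

    def equirect_adjacency(h, w):
        """Normalized adjacency for an h x w equirectangular grid.
        Columns wrap horizontally so the 360-degree seam stays connected;
        rows do not wrap (no vertical wrap across the poles)."""
        n = h * w
        idx = torch.arange(n).view(h, w)
        edges = []
        for dy, dx in ((0, 1), (1, 0)):
            src = idx
            dst = torch.roll(idx, shifts=(-dy, -dx), dims=(0, 1))
            if dy == 1:
                src, dst = src[:-1], dst[:-1]  # drop the wrap edge between top and bottom rows
            edges.append(torch.stack([src.reshape(-1), dst.reshape(-1)]))
        e = torch.cat(edges, dim=1)
        a = torch.zeros(n, n)
        a[e[0], e[1]] = 1.0
        a = (a + a.t()).clamp(max=1.0) + torch.eye(n)  # symmetric, with self-loops
        d = a.sum(1).rsqrt()
        return d[:, None] * a * d[None, :]             # D^-1/2 (A + I) D^-1/2

    class GCNLayer(nn.Module):
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.lin = nn.Linear(in_dim, out_dim)

        def forward(self, x, a_norm):                  # x: (n, in_dim) node features
            return torch.relu(a_norm @ self.lin(x))

    # Toy usage: per-pixel RGB features on a 16x32 panorama, one-channel depth head.
    h, w = 16, 32
    a_norm = equirect_adjacency(h, w)
    feats = torch.rand(h * w, 3)
    depth = GCNLayer(3, 1)(feats, a_norm)              # (h*w, 1) coarse per-pixel output

A dense adjacency matrix is used only to keep the sketch short; a practical system at panorama resolution would need a sparse graph representation.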

Original language: English
Title of host publication: Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 705-706
Number of pages: 2
ISBN (Electronic): 9781728165325
DOIs
State: Published - Mar 2020
Event: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020 - Atlanta, United States
Duration: 22 Mar 2020 - 26 Mar 2020

Publication series

Name: Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020

Conference

Conference: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020
Country/Territory: United States
City: Atlanta
Period: 22/03/20 - 26/03/20

Keywords

  • Computer Graphics
  • Computing methodologies
  • Graphics systems and interface
  • Virtual reality
