3DGAM: using 3D gesture and CAD models for training on mixed reality remote collaboration

Peng Wang, Xiaoliang Bai, Mark Billinghurst, Shusheng Zhang, Sili Wei, Guangyao Xu, Weiping He, Xiangyu Zhang, Jie Zhang

Research output: Contribution to journal › Article › peer-review

52 Scopus citations

Abstract

As Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies become more accessible, it is important to explore how VR/AR/MR can be used for remote collaboration on physical tasks. Previous research has shown that gesture-based interaction is intuitive and expressive for remote collaboration, and that using 3D CAD models can provide clear instructions for assembly tasks. In this paper, therefore, we describe a new MR remote collaboration system which combines the use of gestures and CAD models in a complementary manner. The prototype system enables a remote expert in VR to provide instructions based on 3D gestures and CAD models (3DGAM) to a local worker who uses AR to see these instructions. Using this interface, we conducted a formal user study to explore the effect of sharing 3D gestures and CAD models in an assembly training task. We found that the combination of 3D gestures and CAD models can improve remote collaboration on an assembly task with respect to performance time and user experience. Finally, we provide some conclusions and directions for future research.

Original language: English
Pages (from-to): 31059-31084
Number of pages: 26
Journal: Multimedia Tools and Applications
Volume: 80
Issue number: 20
DOIs
State: Published - Aug 2021

Keywords

  • 3D CAD models
  • Augmented reality
  • Mixed reality
  • Physical tasks
  • Remote collaboration
  • Sharing gesture

