SHARIDEAS: a smart collaborative assembly platform based on augmented reality supporting assembly intention recognition

Zhuo Wang, Yang Wang, Xiaoliang Bai, Xiangyu Huo, Weiping He, Shuo Feng, Jie Zhang, Yueqing Zhang, Jinzhao Zhou

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

With the development of augmented reality (AR) support for manual assembly collaboration, exploring the transformation from the traditional “human-machine” cooperation mode to a smart “human-machine” cooperation mode has become particularly important. Early studies have shown that user cues (i.e., head, gesture, eye) and scene cues (i.e., objects, tools, space) are intuitive and highly expressive in the traditional AR collaborative mode. However, how to integrate these cues into an assembly system, reasonably infer an operator’s work intention, and then provide an appropriate rendering scheme remains an open problem in collaborative assembly. This paper describes an AR collaborative assembly platform, SHARIDEAS, which uses a generalized grey correlation method to fuse user cues and scene cues. The results of this data fusion provide appropriate and intuitive assembly guidance for local workers. A formal user study was conducted to explore the usability and feasibility of SHARIDEAS in a manual assembly task. The experimental data show that SHARIDEAS is more effective than the traditional mode at improving the efficiency of human-machine cooperation. Finally, conclusions about SHARIDEAS are drawn and future research directions are outlined.
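The abstract's fusion step builds on grey relational analysis, the family of methods behind "grey correlation". A minimal sketch of how cue scores could be fused this way is below; the cue names, scores, and intention labels are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of grey relational analysis (GRA) for intention
# recognition. Each candidate intention is scored by its grey relational
# grade against an "ideal" cue template; the cue values here are made up.

def grey_relational_grade(reference, candidate, rho=0.5):
    """Mean grey relational coefficient of `candidate` w.r.t. `reference`.

    rho is the distinguishing coefficient, conventionally 0.5.
    """
    deltas = [abs(r - c) for r, c in zip(reference, candidate)]
    d_min, d_max = min(deltas), max(deltas)
    # Standard GRA coefficient per feature, then averaged into a grade.
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)

# Normalized cue scores in a fixed order:
# (head, gesture, eye, object, tool, space) — illustrative values only.
ideal = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
candidates = {
    "pick_up_tool": [0.9, 0.8, 0.95, 0.7, 0.9, 0.6],
    "inspect_part": [0.4, 0.2, 0.5, 0.9, 0.1, 0.3],
}

scores = {name: grey_relational_grade(ideal, cues)
          for name, cues in candidates.items()}
best = max(scores, key=scores.get)  # intention closest to the ideal template
print(best, scores)
```

In this toy run, "pick_up_tool" wins because its cue profile deviates least from the ideal template; a real system would learn or calibrate one reference template per intention.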

Original language: English
Pages (from-to): 475-486
Number of pages: 12
Journal: International Journal of Advanced Manufacturing Technology
Volume: 115
Issue number: 1-2
DOIs
State: Published - Jul 2021

Keywords

  • Augmented reality
  • Collaboration
  • Intention recognition
  • Manual assembly
