An efficient algorithm for human body matting with RGB-D data

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

In large-screen intelligent interactive applications such as augmented reality (AR) and mediated reality (MR), background removal can give users a stronger sense of immersion and participation. However, existing image segmentation and matting methods have two critical problems, segmentation speed and robustness against complex backgrounds, which make background removal difficult to put into practice. This paper proposes an efficient body matting algorithm that combines the Kinect V2 color stream (1920*1080 resolution) and depth stream (512*424 resolution). First, we obtain a rough segmentation by mapping the body data onto the color data and denoising the mapped result; second, we generate a trimap with a new method that is faster than the traditional approach, especially at high resolution; third, we use this trimap to compute the final body matting result. We evaluated our method on a general-purpose computer with an Intel Core i5-4460 CPU (3.2 GHz) and an NVIDIA GeForce GT 730 graphics card; the processing time for matting a 1920*1080 color image is under 60 ms per frame.
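The abstract outlines a three-stage pipeline (rough depth-based segmentation, trimap generation, trimap-guided matting) but does not spell out the individual steps. The sketch below is only an illustrative reconstruction of the general idea, not the authors' implementation or their faster trimap method: it assumes a rough binary body mask already mapped from the Kinect V2 body/depth data into the color frame, denoises it with morphological filtering, and derives a trimap by erosion/dilation. The function name `rough_mask_to_trimap` and the `band_px` parameter are hypothetical.

```python
# Illustrative sketch only (not the paper's algorithm): denoise a rough body
# mask mapped to the 1920x1080 color frame and build a trimap from it.
import cv2
import numpy as np

def rough_mask_to_trimap(mask, band_px=10):
    """mask: uint8 binary mask (255 = body, 0 = background) at color resolution."""
    # Denoise the mapped mask: remove speckles and fill small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    clean = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    clean = cv2.morphologyEx(clean, cv2.MORPH_CLOSE, kernel)

    # Trimap: sure foreground = eroded mask, sure background = complement of
    # the dilated mask, unknown = the band in between along the silhouette.
    band = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (band_px, band_px))
    sure_fg = cv2.erode(clean, band)
    sure_any = cv2.dilate(clean, band)

    trimap = np.full(mask.shape, 128, dtype=np.uint8)  # 128 = unknown region
    trimap[sure_fg == 255] = 255                        # definite foreground
    trimap[sure_any == 0] = 0                           # definite background
    return trimap
```

In a pipeline of this kind, the resulting trimap would then be passed to an alpha-matting solver to estimate per-pixel alpha for the unknown band and produce the final body matte.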

Original language: English
Title of host publication: ICVR 2018 - Proceedings of the 4th International Conference on Virtual Reality
Publisher: Association for Computing Machinery
Pages: 40-43
Number of pages: 4
ISBN (Electronic): 9781450364089
DOIs
State: Published - 24 Feb 2018
Event: 4th International Conference on Virtual Reality, ICVR 2018 - Hong Kong, China
Duration: 24 Feb 2018 - 26 Feb 2018

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 4th International Conference on Virtual Reality, ICVR 2018
Country/Territory: China
City: Hong Kong
Period: 24/02/18 - 26/02/18

Keywords

  • High-Resolution
  • Kinect
  • Matting
  • Trimap
