Affordance-Driven Next-Best-View Planning for Robotic Grasping

Xuechao Zhang, Dong Wang, Sun Han, Weichuang Li, Bin Zhao, Zhigang Wang, Xiaoming Duan, Chongrong Fang, Xuelong Li, Jianping He

Research output: Contribution to journal › Conference article › peer-review

1 Citation (Scopus)

Abstract

Grasping occluded objects in cluttered environments is an essential component of complex robotic manipulation tasks. In this paper, we introduce an AffordanCE-driven Next-Best-View planning policy (ACE-NBV) that tries to find a feasible grasp for the target object by continuously observing the scene from new viewpoints. This policy is motivated by the observation that the grasp affordances of an occluded object can be better measured under a view whose direction is the same as the grasp view. Specifically, our method leverages the paradigm of novel view imagery to predict the grasp affordances under previously unobserved views, and selects the next observation view based on the highest imagined grasp quality of the target object. The experimental results in simulation and on a real robot demonstrate the effectiveness of the proposed affordance-driven next-best-view planning policy. Project page: https://sszxc.net/ace-nbv/.
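Read as a planning loop, the policy described in the abstract amounts to scoring candidate viewpoints by the grasp quality imagined for the target object under each view and moving the camera to the highest-scoring one. The sketch below is a minimal illustration of that loop, not the authors' implementation; all names (predict_grasp_affordance, sample_candidate_views, select_next_best_view) and the random stand-in for the learned novel-view affordance model are hypothetical.

```python
import numpy as np


def predict_grasp_affordance(observations, view_direction):
    """Hypothetical stand-in for a learned model that imagines the grasp
    quality of the target object under a previously unobserved view."""
    # A real system would render/imagine the novel view and run an
    # affordance network; here we return a placeholder score.
    return float(np.random.rand())


def sample_candidate_views(num_views=16):
    """Sample candidate view directions on a hemisphere above the scene."""
    views = []
    for _ in range(num_views):
        azimuth = np.random.uniform(0.0, 2.0 * np.pi)
        elevation = np.random.uniform(np.deg2rad(20), np.deg2rad(80))
        views.append(np.array([
            np.cos(elevation) * np.cos(azimuth),
            np.cos(elevation) * np.sin(azimuth),
            np.sin(elevation),
        ]))
    return views


def select_next_best_view(observations):
    """Pick the candidate view with the highest imagined grasp quality."""
    candidates = sample_candidate_views()
    qualities = [predict_grasp_affordance(observations, v) for v in candidates]
    return candidates[int(np.argmax(qualities))]
```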

Original language: English
Journal: Proceedings of Machine Learning Research
Volume: 229
Publication status: Published - 2023
Externally published: Yes
Event: 7th Conference on Robot Learning, CoRL 2023 - Atlanta, United States
Duration: 6 Nov 2023 → 9 Nov 2023
