Objective-oriented efficient robotic manipulation: A novel algorithm for real-time grasping in cluttered scenes

Yufeng Li, Jian Gao, Yimin Chen, Yaozhen He

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Grasping unknown objects autonomously in unstructured environments is challenging for robotic manipulators, primarily due to variable environmental conditions and the unpredictable orientations of objects. To address this issue, this paper proposes a grasping algorithm that segments the target object from a single view of the scene and generates collision-free 6-DOF (Degrees of Freedom) grasping poses. First, we develop a YOLO-CMA algorithm for object recognition in dense scenes. Building on this, a point cloud segmentation algorithm guided by the detection results extracts the target object from the scene. A learning network is then designed that takes both the target point cloud and the global point cloud into account; this network performs grasping pose generation, grasping pose scoring, and grasping pose collision detection. We combine these grasp candidates with our bespoke online algorithm to select the optimal grasping pose. Recognition results in dense scenes demonstrate that the proposed YOLO-CMA structure achieves better classification. Furthermore, real-world experiments with a UR3 manipulator indicate that the proposed method achieves real-time grasping, with a grasping success rate of 88.3% and a completion rate of 93.3% in cluttered environments.
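
The abstract outlines a staged pipeline: 2-D detection, detection-guided point cloud segmentation, grasp candidate generation with scoring and collision checks, and an online selection of the best grasp. The minimal Python sketch below illustrates how such a pipeline could be wired together; the function names, data shapes, and selection rule are illustrative assumptions, not the authors' implementation, which is not included in this record.

# Illustrative sketch (not the authors' code) of the pipeline summarized above.
# The 2-D detection, grasp-pose generation, scoring, and collision checks are
# learned components in the paper; here they are simply assumed as inputs.
from dataclasses import dataclass
import numpy as np


@dataclass
class GraspCandidate:
    pose: np.ndarray       # 4x4 homogeneous transform (6-DOF grasp pose)
    score: float           # predicted grasp quality in [0, 1]
    collision_free: bool   # outcome of the collision check against the scene


def segment_target_cloud(scene_xyz, pixel_uv, bbox):
    """Keep the 3-D points whose image projection falls inside the detected
    2-D bounding box (the detection-guided segmentation the abstract describes).
    scene_xyz: (N, 3) scene point cloud; pixel_uv: (N, 2) projected pixel
    coordinates of those points; bbox: (u0, v0, u1, v1) from the detector."""
    u0, v0, u1, v1 = bbox
    mask = ((pixel_uv[:, 0] >= u0) & (pixel_uv[:, 0] <= u1) &
            (pixel_uv[:, 1] >= v0) & (pixel_uv[:, 1] <= v1))
    return scene_xyz[mask]


def select_best_grasp(candidates):
    """Online selection step: discard colliding candidates and return the
    highest-scoring remaining grasp, or None if nothing is feasible."""
    feasible = [g for g in candidates if g.collision_free]
    return max(feasible, key=lambda g: g.score) if feasible else None

In the paper, the grasp candidates themselves come from the learning network conditioned on both the target and global point clouds; the sketch only shows the segmentation and final selection stages around it.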

Original language: English
Article number: 110190
Journal: Computers and Electrical Engineering
Volume: 123
DOIs
State: Published - Apr 2025

Keywords

  • Deep learning
  • Grasping pose detection
  • Regional point cloud
  • Target detection
