TY - GEN
T1 - Visual-Tactile Based Manipulation Control for Complex Robot Manipulation Process via Geometric and Contact State Modeling
AU - Wang, Gaozhao
AU - Liu, Xing
AU - Liu, Zhengxiong
AU - Zhang, Yizhai
AU - Huang, Panfeng
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The accurate representation of object shape, pose, and tool-object contact states is of paramount importance in complex robot tool manipulation tasks. These facets are comprehensively modeled using tactile sensors, which provide a wealth of information. The tactile-based manipulation control framework presented here consists of three core components: the explorer, modeler, and controller. In the context of contact state modeling, we collect the moving trend between the grasped object and the visual-tactile sensor, which could be used for estimating the contact state. Addressing the challenge of precise object modeling, including both shape and pose, we employ the visual-tactile joint exploration and geometric modeling approach introduced in this study. The controller leverages the modeled object's shape and pose while estimating contact states through the contact state classifier. In scenarios where contact states exhibit uncertainty, the robot autonomously engages in exploration, returning to its established contact mode to successfully conclude the manipulation. Peg-hole-insertion experiments featuring pegs and holes of varying shapes have been conducted to empirically validate the efficacy of the presented manipulation control framework.
AB - The accurate representation of object shape, pose, and tool-object contact states is of paramount importance in complex robot tool manipulation tasks. These facets are comprehensively modeled using tactile sensors, which provide a wealth of information. The tactile-based manipulation control framework presented here consists of three core components: the explorer, modeler, and controller. In the context of contact state modeling, we collect the moving trend between the grasped object and the visual-tactile sensor, which could be used for estimating the contact state. Addressing the challenge of precise object modeling, including both shape and pose, we employ the visual-tactile joint exploration and geometric modeling approach introduced in this study. The controller leverages the modeled object's shape and pose while estimating contact states through the contact state classifier. In scenarios where contact states exhibit uncertainty, the robot autonomously engages in exploration, returning to its established contact mode to successfully conclude the manipulation. Peg-hole-insertion experiments featuring pegs and holes of varying shapes have been conducted to empirically validate the efficacy of the presented manipulation control framework.
UR - http://www.scopus.com/inward/record.url?scp=85208019033&partnerID=8YFLogxK
U2 - 10.1109/ICARM62033.2024.10715919
DO - 10.1109/ICARM62033.2024.10715919
M3 - Conference contribution
AN - SCOPUS:85208019033
T3 - ICARM 2024 - 2024 9th IEEE International Conference on Advanced Robotics and Mechatronics
SP - 302
EP - 308
BT - ICARM 2024 - 2024 9th IEEE International Conference on Advanced Robotics and Mechatronics
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th IEEE International Conference on Advanced Robotics and Mechatronics, ICARM 2024
Y2 - 8 July 2024 through 10 July 2024
ER -