Paper Title

AnyGrasp: Robust and Efficient Grasp Perception in Spatial and Temporal Domains

Paper Authors

Hao-Shu Fang, Chenxi Wang, Hongjie Fang, Minghao Gou, Jirong Liu, Hengxu Yan, Wenhai Liu, Yichen Xie, Cewu Lu

Paper Abstract

As the basis for prehensile manipulation, it is vital to enable robots to grasp as robustly as humans. Our innate grasping system is prompt, accurate, flexible, and continuous across spatial and temporal domains. Few existing methods cover all these properties for robot grasping. In this paper, we propose AnyGrasp for grasp perception to endow robots with these abilities using a parallel gripper. Specifically, we develop a dense supervision strategy with real perception and analytic labels in the spatial-temporal domain. Additional awareness of objects' center-of-mass is incorporated into the learning process to help improve grasping stability. Utilization of grasp correspondence across observations enables dynamic grasp tracking. Our model can efficiently generate accurate, 7-DoF, dense, and temporally-smooth grasp poses and works robustly against large depth-sensing noise. Using AnyGrasp, we achieve a 93.3% success rate when clearing bins with over 300 unseen objects, which is on par with human subjects under controlled conditions. Over 900 mean-picks-per-hour is reported on a single-arm system. For dynamic grasping, we demonstrate catching swimming robot fish in the water. Our project page is at https://graspnet.net/anygrasp.html
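
The abstract refers to 7-DoF grasp poses for a parallel gripper. As a minimal illustrative sketch (not the AnyGrasp SDK or the paper's actual data structures; the class and field names below are hypothetical), such a pose is commonly parameterized by a 3-D translation, a 3-D rotation, and the gripper opening width:

```python
import numpy as np

# Illustrative sketch only: a 7-DoF parallel-gripper grasp pose as described
# in the abstract, parameterized by translation (3), rotation (3), and width (1).
class GraspPose7DoF:
    def __init__(self, translation, rotation, width, score=0.0):
        self.translation = np.asarray(translation, dtype=float)  # (3,) grasp center, e.g. in the camera frame
        self.rotation = np.asarray(rotation, dtype=float)        # (3, 3) gripper orientation matrix
        self.width = float(width)                                 # gripper opening width in meters
        self.score = float(score)                                 # predicted grasp quality (hypothetical field)

    def to_matrix(self):
        """Return the 4x4 homogeneous transform of the gripper frame."""
        T = np.eye(4)
        T[:3, :3] = self.rotation
        T[:3, 3] = self.translation
        return T

# Example: a grasp 0.4 m in front of the camera with a 6 cm opening.
grasp = GraspPose7DoF(translation=[0.0, 0.0, 0.4], rotation=np.eye(3), width=0.06, score=0.9)
print(grasp.to_matrix())
```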
