Title
Learning Mobile Manipulation
Authors
Abstract
Providing mobile robots with the ability to manipulate objects has, despite decades of research, remained a challenging problem. The problem is approachable in constrained environments where there is ample prior knowledge of the environment layout and of the manipulable objects. The challenge lies in building systems that scale beyond specific situational instances and operate gracefully in novel conditions. In the past, researchers used heuristic and simple rule-based strategies to accomplish tasks such as scene segmentation or reasoning about occlusion. These heuristic strategies work in constrained environments where a roboticist can make simplifying assumptions about everything from the geometry of the objects to be manipulated, to the level of clutter, camera position, lighting, and a myriad of other relevant variables. The work in this thesis demonstrates how to build a system for robotic mobile manipulation that is robust to changes in these variables. This robustness is enabled by recent simultaneous advances in the fields of big data, deep learning, and simulation. The ability of simulators to create realistic sensory data enables the generation of massive corpora of labeled training data for a variety of grasping and navigation tasks. It is now possible to build systems that work in the real world yet are trained with deep learning entirely on synthetic data. The ability to train and test on synthetic data allows for rapid, iterative development of new perception, planning, and grasp-execution algorithms that work in many environments.
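As a concrete illustration of the synthetic-data pipeline the abstract describes, the sketch below uses PyBullet, one common open-source robotics simulator, to drop randomly chosen objects into a scene and render RGB-D frames with per-pixel segmentation labels from a randomized camera. This is a minimal sketch under assumed tooling, not the thesis's actual implementation: the choice of PyBullet, the bundled random-URDF object set, and all function names and parameters here are illustrative assumptions.

```python
import random
import numpy as np
import pybullet as p
import pybullet_data


def render_labeled_scene(num_objects=4, width=128, height=128):
    """Drop random objects on a plane and render one RGB-D frame
    plus a per-pixel segmentation label map from a random camera.
    Illustrative only; not the thesis's actual data pipeline."""
    p.resetSimulation()
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    p.setGravity(0, 0, -9.81)
    p.loadURDF("plane.urdf")

    # Spawn objects from PyBullet's bundled random-URDF set
    # (assumes pybullet_data ships its random_urdfs directory).
    for _ in range(num_objects):
        idx = random.randint(0, 999)
        p.loadURDF(f"random_urdfs/{idx:03d}/{idx:03d}.urdf",
                   basePosition=[random.uniform(-0.2, 0.2),
                                 random.uniform(-0.2, 0.2),
                                 0.2])
    for _ in range(240):  # let the objects settle under gravity
        p.stepSimulation()

    # Randomize the camera pose so a learned model never sees
    # a single fixed viewpoint (a simple form of domain randomization).
    eye = [random.uniform(-0.5, 0.5),
           random.uniform(-0.5, 0.5),
           random.uniform(0.5, 1.0)]
    view = p.computeViewMatrix(cameraEyePosition=eye,
                               cameraTargetPosition=[0, 0, 0],
                               cameraUpVector=[0, 0, 1])
    proj = p.computeProjectionMatrixFOV(fov=60, aspect=width / height,
                                        nearVal=0.01, farVal=2.0)
    _, _, rgb, depth, seg = p.getCameraImage(width, height,
                                             viewMatrix=view,
                                             projectionMatrix=proj)
    rgb = np.reshape(rgb, (height, width, 4))[:, :, :3]   # drop alpha
    depth = np.reshape(depth, (height, width))
    seg = np.reshape(seg, (height, width))                # object IDs per pixel
    return rgb, depth, seg


if __name__ == "__main__":
    p.connect(p.DIRECT)  # headless rendering; no GUI required
    dataset = [render_labeled_scene() for _ in range(10)]
    p.disconnect()
```

Because the simulator knows the identity and pose of every object it renders, the segmentation labels come for free with each frame; repeating this loop with randomized objects, poses, and camera viewpoints is what makes the massive labeled corpora described above cheap to produce.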