Paper Title

Multiple Hypothesis Semantic Mapping for Robust Data Association

Paper Authors

Lukas Bernreiter, Abel Gawel, Hannes Sommer, Juan Nieto, Roland Siegwart, Cesar Cadena

Paper Abstract

In this paper, we present a semantic mapping approach with multiple hypothesis tracking for data association. As semantic information has the potential to overcome ambiguity in measurements and place recognition, it forms an eminent modality for autonomous systems. This is particularly evident in urban scenarios with several similar-looking surroundings. Nevertheless, it requires handling the non-Gaussian, discrete random variables coming from object detectors. Previous methods leverage semantic information for global localization and data association to reduce the instance ambiguity between landmarks. However, many of these approaches do not deal with the creation of complete, globally consistent representations of the environment and typically do not scale well. We utilize multiple hypothesis trees to derive a probabilistic data association for semantic measurements by means of position, instance, and class to create a semantic representation. We propose an optimized mapping method and make use of a pose graph to derive a novel semantic SLAM solution. Furthermore, we show that semantic covisibility graphs allow for precise place recognition in urban environments. We verify our approach using a real-world outdoor dataset and demonstrate an average drift reduction of 33% w.r.t. the raw odometry source. Moreover, our approach produces 55% fewer hypotheses on average than a regular multiple hypothesis approach.
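To make the abstract's core idea concrete, below is a minimal, hypothetical Python sketch of multiple-hypothesis semantic data association: each measurement branches every hypothesis over candidate landmark assignments (plus a "new landmark" option), scored by a Gaussian position term and a categorical class term, with low-likelihood branches pruned. All names, structures, and parameters (e.g. `sigma`, `new_landmark_logp`, `top_k`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of semantic multiple-hypothesis data association.
# Not the paper's code: scoring terms and pruning strategy are assumptions.
import math

class Hypothesis:
    def __init__(self, assignments=(), log_likelihood=0.0):
        self.assignments = assignments        # tuple of (measurement_id, landmark_id)
        self.log_likelihood = log_likelihood  # accumulated association score

def association_log_likelihood(meas, landmark, sigma=1.0):
    """Score one measurement-landmark pairing by position and semantic class."""
    # Gaussian position term (isotropic, variance sigma^2)
    d2 = sum((m - l) ** 2 for m, l in zip(meas["pos"], landmark["pos"]))
    pos_term = -d2 / (2.0 * sigma ** 2)
    # Categorical class term: detector probability for the landmark's class
    cls_term = math.log(max(meas["class_probs"].get(landmark["class"], 1e-9), 1e-9))
    return pos_term + cls_term

def branch(hypotheses, meas_id, meas, landmarks, new_landmark_logp=-8.0, top_k=10):
    """Expand every hypothesis over plausible assignments, then prune to top_k."""
    children = []
    for h in hypotheses:
        used = {lm for _, lm in h.assignments}
        for lm_id, lm in landmarks.items():
            if lm_id in used:
                continue  # at most one measurement per landmark within a hypothesis
            ll = h.log_likelihood + association_log_likelihood(meas, lm)
            children.append(Hypothesis(h.assignments + ((meas_id, lm_id),), ll))
        # always allow a "new landmark" branch for unmatched detections
        children.append(Hypothesis(h.assignments + ((meas_id, None),),
                                   h.log_likelihood + new_landmark_logp))
    children.sort(key=lambda c: c.log_likelihood, reverse=True)
    return children[:top_k]  # pruning keeps the tree from growing exponentially

if __name__ == "__main__":
    landmarks = {
        "lm0": {"pos": (0.0, 0.0), "class": "car"},
        "lm1": {"pos": (5.0, 1.0), "class": "tree"},
    }
    meas = {"pos": (0.3, -0.2), "class_probs": {"car": 0.8, "tree": 0.2}}
    for h in branch([Hypothesis()], "z0", meas, landmarks):
        print(h.assignments, round(h.log_likelihood, 2))
```

The pruning step is where the abstract's hypothesis-count reduction would come into play: by fusing position, instance, and class evidence into a single score, ambiguous branches can be discarded earlier than with position-only gating.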
