6D localization

Feature-Based Mobile Mapping with 3D Lidars on Embedded GPUs

This thesis presents two solutions to the Simultaneous Localization and Mapping (SLAM) problem that share a common core. Featsense uses lidar point cloud features for odometry estimation, while Warpsense uses a GPU-accelerated Point-to-TSDF scan-matching algorithm that performs localization in a high-resolution, continuous Truncated Signed Distance Field (TSDF) representation of the environment. Both methods share the same mapping backend, a highly GPU-optimized TSDF generation module that enables the generation of efficient triangle meshes in post-processing.
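The abstract does not go into implementation detail; as a rough illustration of the point-to-TSDF idea, the following is a minimal CPU sketch that integrates lidar points into a truncated signed distance grid and evaluates the signed distance at a query point. The grid layout, voxel size, truncation distance and all names are assumptions made for illustration and are not the GPU-optimized backend described above.

```cpp
// Minimal CPU sketch of a point-to-TSDF residual (illustrative only; names and
// parameters are assumptions, not the thesis' GPU implementation).
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <unordered_map>

struct Voxel { float tsdf = 0.0f; float weight = 0.0f; };

struct TsdfGrid {
    float voxel_size = 0.05f;   // assumed voxel edge length [m]
    float truncation = 0.3f;    // assumed truncation distance tau [m]
    std::unordered_map<int64_t, Voxel> voxels;

    static int64_t key(int x, int y, int z) {
        // Pack integer voxel indices into one 64-bit hash key.
        auto p = [](int v) { return (int64_t)(v + (1 << 20)); };
        return (p(x) << 42) | (p(y) << 21) | p(z);
    }

    // Integrate one lidar point seen from 'origin': update the voxels within
    // +/- truncation around the hit with a running average of the TSDF value.
    void integrate(float ox, float oy, float oz, float px, float py, float pz) {
        float dx = px - ox, dy = py - oy, dz = pz - oz;
        float range = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (range <= 0.0f) return;
        dx /= range; dy /= range; dz /= range;
        for (float d = range - truncation; d <= range + truncation; d += voxel_size) {
            int vx = (int)std::floor((ox + d * dx) / voxel_size);
            int vy = (int)std::floor((oy + d * dy) / voxel_size);
            int vz = (int)std::floor((oz + d * dz) / voxel_size);
            float sdf  = range - d;                               // distance to measured surface
            float tsdf = std::fmax(-1.0f, std::fmin(1.0f, sdf / truncation));
            Voxel& v = voxels[key(vx, vy, vz)];
            v.tsdf = (v.tsdf * v.weight + tsdf) / (v.weight + 1.0f);
            v.weight += 1.0f;
        }
    }

    // Point-to-TSDF residual: the (nearest-voxel) signed distance at a query point.
    float residual(float x, float y, float z) const {
        int vx = (int)std::floor(x / voxel_size);
        int vy = (int)std::floor(y / voxel_size);
        int vz = (int)std::floor(z / voxel_size);
        auto it = voxels.find(key(vx, vy, vz));
        return it == voxels.end() ? truncation : it->second.tsdf * truncation;
    }
};

int main() {
    TsdfGrid grid;
    grid.integrate(0, 0, 0, 2.0f, 0, 0);              // sensor at origin, hit at x = 2 m
    std::printf("residual near surface: %.3f m\n", grid.residual(1.98f, 0, 0));
}
```

A point-to-TSDF scan matcher minimizes such residuals over all points of a scan to obtain the 6DoF pose; according to the abstract, the thesis performs this integration and matching on the GPU.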

Integration of Different Sensor Sources into KinectFusion for Improved Mapping in Environments without Landmarks

Object detection based AMCL in semantic 3D maps

Monocular Localization in Feature-Annotated 3D Polygon Maps

6DoF localization and mapping is becoming increasingly relevant in autonomous robotics. Especially in environments where GPS is unavailable (e.g., indoors), camera-based localization methods are becoming more common. The aim of this work is to develop a complete system that allows a robot to localize itself in a 3D map using a camera. The developed software is divided into offline mapping and online localization. The map consists of a polygon mesh, which is reconstructed from 3D laser scans of the environment.
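The abstract separates offline mapping from online localization; as a hedged sketch of what the online step could look like, the snippet below estimates a 6DoF camera pose from 2D-3D correspondences between image keypoints and feature-annotated map points using OpenCV's RANSAC PnP solver. All function and variable names are illustrative assumptions and do not reflect the thesis' actual interface.

```cpp
// Sketch of an online localization step, assuming the feature-annotated polygon
// map already yields 2D-3D correspondences (image keypoints matched to annotated
// map features). Names are illustrative, not the thesis' implementation.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Estimate the 6DoF camera pose from matched map points (3D) and keypoints (2D).
bool localize(const std::vector<cv::Point3f>& map_points,    // 3D features from the map
              const std::vector<cv::Point2f>& image_points,  // matched 2D detections
              const cv::Mat& K,                               // camera intrinsics (3x3)
              cv::Mat& R, cv::Mat& t)                         // output rotation / translation
{
    if (map_points.size() < 4) return false;                  // PnP needs at least 4 points
    cv::Mat rvec, tvec, inliers;
    // RANSAC rejects wrong 2D-3D matches before the pose is estimated.
    bool ok = cv::solvePnPRansac(map_points, image_points, K, cv::noArray(),
                                 rvec, tvec, false, 200, 4.0, 0.99, inliers);
    if (!ok) return false;
    cv::Rodrigues(rvec, R);                                    // axis-angle -> rotation matrix
    t = tvec;
    return true;                                               // map-to-camera transform
}
```

The returned rotation and translation transform map coordinates into the camera frame; inverting this transform gives the camera pose in the map, which is the quantity a robot needs for localization.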