2021
First publication
Master's thesis
Publisher's version

Development of a Method for the Fusion of Environment Perception Sensor Data for an Automated Vehicle

File(s)
Main publication
Masterthesis_Marc_Uecker_tuprints.pdf
CC BY-NC-ND 4.0 International
Format: Adobe PDF
Size: 35.18 MB
TUDa URI
tuda/7062
URN
urn:nbn:de:tuda-tuprints-186134
DOI
10.26083/tuprints-00018613
Authors
Uecker, Marc ORCID 0000-0003-2489-5841
Abstract

Autonomous driving is currently one of the most anticipated future technologies in the automotive world, and researchers all over the world are dedicated to this task. In this pursuit, the aDDa project at TU Darmstadt is a collaboration of researchers and students focused on jointly engineering a car into a fully autonomous vehicle. As such, the aDDa research vehicle is outfitted with a wide array of sensors for environment perception.

Within the scope of the aDDa project, this thesis covers the fusion of data from LIDAR, RADAR, and camera sensors into a unified environment model. Specifically, it focuses on providing real-time environment perception, including the fusion and interpretation of data from different sensors, using only on-board hardware resources.

The developed method is a software pipeline consisting of an analytical low-level sensor fusion stage, a deep-learning-based 3D semantic segmentation model, and analytical clustering and tracking methods, as well as a proof of concept for estimating drivable space. The method is designed to maximize robustness by minimizing the influence of the machine learning component on the reliability of obstacle detection. The sensor fusion pipeline runs in real time with an output frequency of 10 Hz and a pipeline delay of 120 to 190 milliseconds in the situations encountered on public roads.
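
The abstract names only the order of the pipeline stages. The following minimal Python sketch illustrates how such a staged design can confine the learned component to one stage, so that the analytical clustering and tracking stage does not depend solely on it. All names and the trivial stage bodies are hypothetical placeholders, not the thesis implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class EnvironmentModel:
    """Hypothetical container for the fused scene state."""
    timestamp: float
    tracked_obstacles: list = field(default_factory=list)
    drivable_space: list = field(default_factory=list)

def fuse_low_level(lidar_points, radar_targets, camera_image):
    # Stage 1 (analytical): time-align and combine raw sensor data.
    # Placeholder: simply bundle the inputs.
    return {"points": lidar_points, "radar": radar_targets, "image": camera_image}

def segment_3d(fused_frame):
    # Stage 2 (learned): per-point semantic classes from a 3D model.
    # Placeholder: label every point "unknown".
    return [("unknown", p) for p in fused_frame["points"]]

def cluster_and_track(labeled_points):
    # Stage 3 (analytical): group points into obstacles and track them.
    # Because this stage is analytical, obstacles can still be detected
    # even if the learned stage mislabels individual points.
    return [labeled_points] if labeled_points else []

def estimate_drivable_space(labeled_points):
    # Stage 4: proof-of-concept drivable-space estimate (placeholder).
    return []

def run_cycle(lidar_points, radar_targets, camera_image):
    # One pipeline cycle. At a 10 Hz output rate each cycle has a 100 ms
    # budget; the reported 120-190 ms end-to-end delay implies that stages
    # of successive frames overlap in the real system.
    fused = fuse_low_level(lidar_points, radar_targets, camera_image)
    labeled = segment_3d(fused)
    return EnvironmentModel(
        timestamp=time.time(),
        tracked_obstacles=cluster_and_track(labeled),
        drivable_space=estimate_drivable_space(labeled),
    )
```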

An evaluation of several scenarios shows that the developed system can reliably detect a target vehicle in a variety of real-world situations.

The contributions of this work include not only the development of a sensor fusion pipeline, but also methods for sensor calibration and a novel method for generating training data for the machine learning component. In contrast to existing manual methods of data annotation, this work presents a scalable solution for annotating real-world sensor recordings to generate training data for 3D machine perception approaches in autonomous driving.
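
The abstract does not describe the annotation method itself. A common way to label 3D recordings at scale, sketched below in NumPy, is to transfer per-pixel classes from a segmented camera image onto LIDAR points via the camera calibration. This is a generic label-transfer illustration under assumed names and matrix conventions, not the method developed in the thesis.

```python
import numpy as np

def label_points_from_image(points_xyz, seg_mask, K, T_cam_lidar):
    """Transfer per-pixel classes from a segmented camera image to LIDAR points.

    Assumed inputs (hypothetical, for illustration):
      points_xyz  : (N, 3) LIDAR points in the LIDAR frame
      seg_mask    : (H, W) integer class ids from a 2D segmentation model
      K           : (3, 3) camera intrinsic matrix
      T_cam_lidar : (4, 4) homogeneous transform, LIDAR frame -> camera frame
    Returns (N,) class ids; -1 marks points without a valid projection.
    """
    n = points_xyz.shape[0]
    labels = np.full(n, -1, dtype=int)

    # Transform points into the camera frame.
    homo = np.hstack([points_xyz, np.ones((n, 1))])      # (N, 4)
    cam = (T_cam_lidar @ homo.T).T[:, :3]                # (N, 3)

    # Keep only points in front of the camera to avoid dividing by ~0.
    front = cam[:, 2] > 0.1
    uvw = (K @ cam[front].T).T                           # (M, 3)
    uv = uvw[:, :2] / uvw[:, 2:3]                        # perspective divide
    u = uv[:, 0].astype(int)
    v = uv[:, 1].astype(int)

    # Discard projections that fall outside the image.
    h, w = seg_mask.shape
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    idx = np.flatnonzero(front)[ok]
    labels[idx] = seg_mask[v[ok], u[ok]]
    return labels
```

Points that fall behind the camera or outside the image stay unlabeled, which is one reason overlapping sensor fields of view are attractive for automatic annotation.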

Free keywords

sensor fusion, environment perception, machine perception, machine learning, object detection, semantic segmentation, pointclouds, point cloud, pointcloud, deep learning, computer vision, autonomous driving, LIDAR, RADAR, camera, calibration, sensor calibration, camera calibration, motion compensation, data annotation, training data, dataset, labeling

Language
English
Alternative title
Entwicklung einer Methode zur Datenfusion von Umfelderfassungssensoren für ein automatisiertes Fahrzeug
Department/Field
20 Department of Computer Science > Interactive Graphics Systems
DDC
000 Generalities, computer science, information science > 004 Computer science
600 Technology, medicine, applied sciences > 620 Engineering and mechanical engineering
Institution
Technische Universität Darmstadt
Place
Darmstadt
Referees
Kuijper, Arjan ORCID 0000-0002-6413-0061
Winner, Hermann ORCID 0000-0002-9824-3195
Linnhoff, Clemens ORCID 0000-0001-7571-0734
Degree-granting institution
Technische Universität Darmstadt
Place of the degree-granting institution
Darmstadt
PPN
48123098X
