Test Rigs

One of the main goals of the chair is algorithm development for decision making and control, yielding technological impact in the domains of robotics, automotive, production and power systems. As we do not develop our own hardware solutions, this necessitates the availability of profound lab facilities, including testbeds as well as computation facilities and sensing infrastructure. Since its foundation, the chair's lab facilities have been continuously advanced for applications in autonomous driving, robotics and power systems. They have been procured with funding from the federal ministries, for which we are very grateful.

An overview of the test rigs currently available at the institute reads as follows:

  • Autonomous driving vehicles (VW Passat and Tiguan)
  • AI/ML high-performance computation: NVIDIA DGX2 and Supermicro HGX A100 system
  • Car2X communication infrastructure
  • Power hardware-in-the-loop (PowerHIL) test bench (single module, 3 AC / 2 DC phases)
  • Collaborative robot arms (2 UR5 systems)
  • Ground Penetrating Radar (GSSI)
  • Omnidirectional mobile robot (based on Donkey system)
  • Humanoid robot (NAO)
  • Scaled Smart Factory (fischertechnik)
  • Drones (self-developed)

Automated Vehicles

The chair possesses two automated vehicles that have been assembled by IAV GmbH. The vehicles are equipped with extensive real-time computation and environment sensing resources, including interfaces supporting vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. 

In addition to designing model-based and data-driven (i.e. AI-based) algorithms in the domains of control, decision making and computer vision, we aim to demonstrate autonomous driving functions in practice and thereby derive challenges for the next generation of autonomous functions and algorithms.
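
As a generic illustration of the kind of plant model used in model-based trajectory tracking (not the controller deployed on our vehicles), the sketch below steps a kinematic bicycle model forward in time; the wheelbase and all numeric values are placeholders.

  import math

  def bicycle_step(x, y, yaw, v, a, delta, dt=0.05, L=2.7):
      """One integration step of a kinematic bicycle model.
      x, y, yaw, v : pose [m, m, rad] and speed [m/s]
      a, delta     : acceleration [m/s^2] and steering angle [rad]
      L            : wheelbase [m] (placeholder, not a vehicle parameter of ours)
      """
      x += v * math.cos(yaw) * dt
      y += v * math.sin(yaw) * dt
      yaw += v / L * math.tan(delta) * dt
      v += a * dt
      return x, y, yaw, v

  # Roll the model out for 2 s at constant speed with a small steering angle.
  state = (0.0, 0.0, 0.0, 10.0)
  for _ in range(40):
      state = bicycle_step(*state, a=0.0, delta=0.05)
  print(state)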

The development of the vehicles has been funded by the federal ministries in the context of the research projects Radspot (GZ 16AVF2166) and Aorta (GZ 01MM20002).

Collaborative Robotic Arms

  • Two collaborative robotic arms UR5 from Universal Robots (a command-streaming sketch follows this list)
    • Number of axes: 6 rotational axes
    • Working radius: 850 mm
    • Maximum payload: 5 kg
    • Control frequency: 125 Hz
  • Two pneumatic grippers for fruit grasping
  • Two pneumatic grippers for bottle grasping
  • Two universal 2-Finger Gripper from Robotiq
    • Gripper opening: 0-140 mm
    • Maximum payload: 2.5 kg
    • Gripping Force: 10-125 N
    • Closure speed: 30-250 mm/s
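
To illustrate the 125 Hz external control interface listed above, the following sketch streams linearly interpolated joint set-points on the 8 ms control grid. send_joint_command is a hypothetical stub standing in for the robot's real-time interface (e.g. a servoj command over the controller's TCP port); it is not the driver actually used in the lab.

  import math
  import time

  CONTROL_DT = 1.0 / 125.0   # the UR5 accepts external commands at 125 Hz

  def send_joint_command(q):
      """Hypothetical stub for the robot's real-time joint interface."""
      print("q [rad]:", ["%.3f" % v for v in q])

  def stream_joint_motion(q_start, q_goal, duration):
      """Stream linearly interpolated joint set-points at 125 Hz."""
      steps = max(1, int(duration / CONTROL_DT))
      for i in range(1, steps + 1):
          s = i / steps
          q = [a + s * (b - a) for a, b in zip(q_start, q_goal)]
          t0 = time.time()
          send_joint_command(q)
          # Stay on the 8 ms grid by sleeping away the remaining cycle time.
          time.sleep(max(0.0, CONTROL_DT - (time.time() - t0)))

  home = [0.0, -math.pi / 2, 0.0, -math.pi / 2, 0.0, 0.0]
  target = [0.3, -1.2, 0.5, -1.8, 0.2, 0.0]
  stream_joint_motion(home, target, duration=2.0)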

Omnidirectional Mobile Robot

  • Omnidirectional movements
  • Mecanum wheels (an inverse-kinematics sketch follows this list)
    • Roller arrangement: 45°
    • Nominal torque: 4 x 25 Nm
    • Maximum torque: 4 x 75 Nm
    • Brake torque: 4 x 32 Nm
  • Maximum speed: 1.0 m/s
  • Maximum payload: 2750 N
  • Sensors
    • Laser scanners with 360° view
    • Stereo cameras
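
Given the 45° roller arrangement listed above, wheel speeds follow from the commanded body velocity via the standard Mecanum inverse kinematics. The sketch below uses placeholder values for wheel radius and base geometry; they are not the robot's actual parameters.

  import numpy as np

  def mecanum_wheel_speeds(vx, vy, wz, half_length=0.3, half_width=0.25, r=0.1):
      """Inverse kinematics of a four-wheel Mecanum base with 45° rollers.
      vx, vy : body-frame linear velocity [m/s] (x forward, y left)
      wz     : yaw rate [rad/s]
      half_length, half_width, r : base geometry and wheel radius [m] (placeholders)
      Returns wheel angular velocities [rad/s] in the order FL, FR, RL, RR.
      """
      k = half_length + half_width
      return np.array([
          (vx - vy - k * wz) / r,   # front left
          (vx + vy + k * wz) / r,   # front right
          (vx + vy - k * wz) / r,   # rear left
          (vx - vy + k * wz) / r,   # rear right
      ])

  # Pure sideways motion at 0.5 m/s: the two wheels on each axle spin in
  # opposite directions.
  print(mecanum_wheel_speeds(0.0, 0.5, 0.0))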

Humanoid Robot NAO

  • 25 joints (2 head, 10 arm, 1 pelvis, 10 leg and 2 hand joints)
  • Dimensions (H x D x W): 573 x 275 x 311 mm
  • Weight 5.2 kg
  • Sensors and interfaces
    • two cameras
    • two loudspeakers
    • four microphones
    • two infrared sensors
    • two ultrasonic sensors
    • inertial unit (two gyrometers and one accelerometer)
    • eight force sensitive sensors
    • 36 position sensors
  • Programming: C++ / Python / .NET / Java / MATLAB (a minimal Python example follows this list)
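
A minimal example of commanding the NAO through the NAOqi Python SDK (which ships for Python 2.7); the robot's IP address is a placeholder and the call sequence is a generic hello-world demo, not one of our applications.

  # Requires the NAOqi Python SDK from SoftBank Robotics (Python 2.7).
  from naoqi import ALProxy

  NAO_IP = "192.168.1.20"   # placeholder address of the robot
  PORT = 9559               # default NAOqi port

  tts = ALProxy("ALTextToSpeech", NAO_IP, PORT)   # speech synthesis module
  motion = ALProxy("ALMotion", NAO_IP, PORT)      # joint and locomotion control

  motion.wakeUp()                  # stiffen joints and stand up
  tts.say("Hello from the lab")    # speak through the loudspeakers
  motion.moveTo(0.2, 0.0, 0.0)     # walk 0.2 m straight ahead
  motion.rest()                    # go to a safe resting posture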

High Performance Computation

The chair possesses two high-end NVIDIA solutions for high-performance computation:

  • DGX1 with 4× NVIDIA V100 (32 GB VRAM each)
  • DGX2 with 4× NVIDIA A100 (40 GB VRAM each, latest)

NVIDIA DGX Station™ A100 at a glance (a minimal multi-GPU usage sketch follows the list):

  • AI workgroup server delivering 2.5 petaFLOPS of performance that can be used for training, inference, and data analytics
  • World-class AI platform, leveraging NVIDIA® NVLink® for running parallel jobs and multiple users without impacting system performance
  • The world’s only workstation-style system with four fully interconnected NVIDIA A100 Tensor Core GPUs and up to 320 gigabytes (GB) of GPU memory
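
As a minimal sketch of how such a multi-GPU node is typically used for training and inference (PyTorch is used here only as an example framework; this is not our actual pipeline), the snippet below spreads a forward pass of a toy model over all visible GPUs:

  import torch
  import torch.nn as nn

  # Number of GPUs visible on the node (e.g. the four A100s of the DGX Station A100).
  n_gpus = torch.cuda.device_count()
  print("Visible GPUs:", n_gpus)

  # Toy model; nn.DataParallel splits each input batch across all visible GPUs.
  model = nn.Linear(1024, 10)
  if n_gpus > 1:
      model = nn.DataParallel(model)
  model = model.to("cuda" if torch.cuda.is_available() else "cpu")

  device = next(model.parameters()).device
  x = torch.randn(256, 1024, device=device)
  y = model(x)   # the forward pass runs in parallel on all visible GPUs
  print(y.shape)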

Ground Penetrating Radar (GPR)

The chair has gathered extensive experience in applying machine learning for damage detection in the road substrate. To this end, a GSSI sensor has been coupled to one of our automated vehicles, utilizing high-precision localization and camera data to cross-label surface damage detections. In addition, AI-based algorithms for highly accurate localization using GPR radargrams as fingerprints have been developed and successfully demonstrated. These works have been conducted in cooperation with GGD Geophysik GmbH.
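
A minimal sketch of the fingerprint idea, assuming a georeferenced map of previously recorded, dGPS-tagged traces: a query trace is matched against the map by normalized cross-correlation and the best-scoring reference position is returned. Array shapes and the random test data are purely illustrative; this is not the deployed algorithm.

  import numpy as np

  def localize_by_fingerprint(query, reference_map, positions):
      """Match a GPR trace (fingerprint) against a georeferenced reference map.
      query         : (n_samples,) trace recorded at the unknown position
      reference_map : (n_positions, n_samples) dGPS-tagged reference traces
      positions     : (n_positions, 2) coordinates of the reference traces
      """
      # Zero-mean, unit-norm traces make the score insensitive to gain and offset.
      q = query - query.mean()
      q /= np.linalg.norm(q) + 1e-12
      r = reference_map - reference_map.mean(axis=1, keepdims=True)
      r /= np.linalg.norm(r, axis=1, keepdims=True) + 1e-12
      scores = r @ q                      # normalized cross-correlation per position
      best = int(np.argmax(scores))
      return positions[best], scores[best]

  # Illustrative usage with random data standing in for 512-sample scans.
  ref = np.random.randn(1000, 512)
  pos = np.cumsum(np.random.rand(1000, 2), axis=0)
  query = ref[123] + 0.05 * np.random.randn(512)
  print(localize_by_fingerprint(query, ref, pos))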

  • GSSI SIR 30, dual-channel GPR, 512 samples per scan and antenna.
  • Max scan rate is 700 scans per second.
  • We operate it in "Realtime Data Output Mode" to fuse it with highly precise position information from our dGPS (scan limit is then 200 scans/second).
  • The antennas used are:
    • Model 42000S (2 GHz antenna, measurement depth approx. 2.5 m)
    • Model 50400S (400 MHz antenna, measurement depth approx. 0.6 m)

Scaled Smart Factory 4.0 from fischertechnik

The chair owns two integrated units. Each factory environment consists of the factory modules storage/retrieval station, vacuum suction gripper, high-bay warehouse and multi-processing station with kiln, a sorting line with colour detection, an environment sensor and a pivoting camera. The integrated environment sensor reports values for temperature, humidity, air pressure and air quality. Thanks to its vertical and horizontal pivoting range, the camera overlooks the entire plant and can thus be used for web-based remote monitoring. The individual workpieces are tracked via NFC (Near Field Communication).

The Learning Factory 4.0 is controlled by algorithms implemented on a Raspberry Pi and on the 9 V-based TXT controllers. These are networked with one another inside the factory and communicate via MQTT (a minimal subscriber sketch follows below). The software application is implemented through a C/C++ programming interface.
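
A minimal subscriber sketch, assuming the paho-mqtt 1.x client API and shown in Python for brevity (the factory itself is programmed through the C/C++ interface mentioned above); broker address and topic are placeholders for the factory-internal MQTT setup.

  # Assumes the paho-mqtt 1.x callback API; broker and topic are placeholders.
  import paho.mqtt.client as mqtt

  BROKER = "192.168.0.10"           # placeholder address of the factory's MQTT broker
  TOPIC = "factory/environment"     # placeholder topic of the environment sensor

  def on_connect(client, userdata, flags, rc):
      print("Connected with result code", rc)
      client.subscribe(TOPIC)

  def on_message(client, userdata, msg):
      # Payloads carry the sensor readings (temperature, humidity, pressure, air quality).
      print(msg.topic, msg.payload.decode())

  client = mqtt.Client()
  client.on_connect = on_connect
  client.on_message = on_message
  client.connect(BROKER, 1883, 60)
  client.loop_forever()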