Research

The Real Time Systems Group focuses on the following areas: the planning of complex technical systems, the modelling and analysis of discrete-event systems with formal methods, software development methods and tools for automation technology, and the programming and testing of embedded, networked control devices with regard to real-time behaviour, reliability and security.

Applications range from plant automation with industrial programmable logic controllers to the discrete-event (reactive) control of autonomous mobile robots using embedded microcontrollers and real-time operating systems.

Currently, the RTS research projects can be assigned to the following areas:

Projects

SmokeBot

USBV-Inspektor

Researchers: Dipl.-Ing. Christian Wieghardt, M. Sc. Sebastian Kleinschmidt

As part of the USBV-Inspektor project, L3S is developing a multimodal sensor suite together with the North Rhine-Westphalia State Office of Criminal Investigation, Fraunhofer FHR, ELP GmbH and Hentschel System GmbH. This sensor suite consists of a millimetre-wave scanner, a 3D range scanner and a high-resolution camera. The system perceives the internal and external geometry of suspicious objects and generates high-resolution images of them to support the local response forces and to collect additional evidence.

Motivation

Suitcases, bags or backpacks left lying around unsupervised are part of daily life. Even though most abandoned luggage items turn out to be harmless, they can trigger a large-scale police operation. The objective of the USBV-Inspektor project is to develop a sensor suite that can be mounted on the end effector of a remotely operated robot, enabling local forces to inspect suspicious objects without placing themselves in danger. The operator controls the robot from a safe distance and receives a 3D visualization of the environment. Furthermore, the data from the sensor suite can be used to preserve evidence for criminal proceedings and to facilitate the threat assessment of suspicious items by providing additional 3D information about the crime scene and the contents of the object.

Mobile Service Robots

Autonomous Forklift – a Revolution on the Factory Floor

The Realtime Systems Group (RTS) of the Leibniz Universität Hannover cooperates with STILL GmbH to facilitate flexible navigation and materials handling.

An overriding aim of the project is to reduce costs by using flexible, automated materials handling vehicles – both in warehouses and in production facilities. Economic gains can be expected, for example, from a considerably reduced risk of accidentally dropped material or misplaced goods. Furthermore, little or no infrastructure changes (such as track guides) are necessary. The possibilities for an autonomous forklift truck are enormous, since the transition between manual and automated operation can be optimised. Furthermore, transport processes can be reproduced, even those that require a high degree of care when positioning loads.

Navigation of Autonomous Systems in Outdoor Environments

Service robots autonomously mow green areas and pull out weeds in large parks or sports facilities. Tourists are comfortably guided to places of interest or accompanied through a trade fair. Disabled and elderly people find help with everyday routines. There are countless examples like these in which service robots support individuals. To solve the given tasks efficiently, the robot always has to know its present location and is thus constantly looking for the answer to the question: "Where am I?".

To find the answer to this question, we develop new procedures and mathematical models for navigating autonomous systems. In this case, we focus on navigation in unknown, natural surroundings. In contrast to indoor navigation, where the environment is structured by straight walls, doors and flat floors, the obstacles and objects that can be used for navigation in outdoor environments are much more complex. The shape and surface of objects (trees, bushes, buildings etc.) are irregular and change over time. Influences such as wind can hardly be described using unambiguous mathematical models. By fusing different sensors such as GPS, compass, laser scanner, wheel sensors and inertial sensors, we succeed in autonomously generating environment models of the working area. Using these maps, localization is possible with an accuracy of only a few centimetres.
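
To illustrate the basic idea behind this kind of sensor fusion, the following sketch combines wheel-odometry dead reckoning with absolute GPS fixes. The fixed correction gain and the simple weighted update are illustrative assumptions, not the estimator actually used in our systems.

```cpp
// Minimal sketch of the sensor fusion described above: wheel odometry predicts
// the robot pose, an absolute GPS fix corrects the drifting position estimate.
// The gain value and the weighted update are illustrative assumptions.
#include <cmath>

struct Pose2D {
    double x = 0.0;     // metres, world frame
    double y = 0.0;     // metres, world frame
    double theta = 0.0; // heading in radians
};

// Prediction step: integrate wheel odometry (distance travelled, heading change).
void predictFromOdometry(Pose2D& pose, double distance, double dTheta) {
    pose.theta += dTheta;
    pose.x += distance * std::cos(pose.theta);
    pose.y += distance * std::sin(pose.theta);
}

// Correction step: blend in an absolute GPS position with a fixed gain.
// A Kalman filter would instead derive this gain from the sensor covariances.
void correctWithGps(Pose2D& pose, double gpsX, double gpsY, double gain = 0.2) {
    pose.x += gain * (gpsX - pose.x);
    pose.y += gain * (gpsY - pose.y);
}
```

In practice, a probabilistic filter computes the correction weights from the sensor uncertainties and additionally fuses compass, inertial and laser-based measurements.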

Robotic FireFighters

Recent disasters, such as the Fukushima catastrophe, the attack on the WTC, the Gulf of Mexico oil spill, or the Mont Blanc Tunnel fire, have shown that post-disaster management tasks still require considerable human intervention, even though the humans entering affected areas risk their health and lives, causing human tragedy and immense cost for national economies. With a view to keeping these dangers and costs at bay, and to increasing the effectiveness of disaster management operations, our long-term vision is that teams of autonomous robots equipped with sensors, manipulators and communication capabilities will be able to enter dangerous, polluted or contaminated areas and to manage all the necessary tasks (e.g. search for survivors, evacuate injured persons, remove safety-critical materials, clean up) without the need for explicit human assistance. This inspires the notion of Robotic FireFighters (RFF), in analogy to human fire brigades, where firefighters work together towards the common goal of getting a disaster under control.

3D Laser Scanner

To obtain 3D environmental data, the Institute for Systems Engineering has developed a continuously rotating 3D scanner consisting of a Scan Drive system designed especially for this application and a SICK laser scanner. The sensor can be used on mobile robot platforms for environmental perception and navigation. It captures a complete image of its surroundings every 2 seconds, with more than 13,000 measuring points processed every second. These points are synchronized with the rotation angle of the 2D scanner and the current position and orientation of the robot to generate accurate 3D data even while the robot is moving. Evaluation of the acquired data is performed on a Scalable Processing Box (SPB) on the basis of our robot operating system. Using this real-time operating system, it is possible to avoid the systematic measurement errors that would otherwise occur due to timing differences during data processing.
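
The following sketch shows how a single 2D range measurement could be lifted into a global 3D point by combining it with the scan-drive rotation angle and the synchronized robot pose. The mounting geometry (a vertical scan plane rotated about the vertical axis, a planar robot pose, scanner at the pose origin) is an assumption chosen purely for illustration.

```cpp
// Sketch: combine a 2D laser measurement with the scan-drive angle and the
// synchronized robot pose to obtain a global 3D point. Mounting geometry is
// an illustrative assumption, not the actual Scan Drive kinematics.
#include <cmath>

struct Point3D { double x, y, z; };

Point3D toGlobal(double range,      // measured distance [m]
                 double beamAngle,  // beam angle within the 2D scan plane [rad]
                 double driveAngle, // rotation angle of the scan drive [rad]
                 double robotX, double robotY, double robotYaw)
{
    // Point in the scanner frame: the 2D scanner sweeps a vertical plane.
    double px = range * std::cos(beamAngle);
    double pz = range * std::sin(beamAngle);

    // Rotate the scan plane about the vertical axis by the scan-drive angle.
    double sx = px * std::cos(driveAngle);
    double sy = px * std::sin(driveAngle);

    // Transform into the world frame using the synchronized robot pose.
    return {
        robotX + sx * std::cos(robotYaw) - sy * std::sin(robotYaw),
        robotY + sx * std::sin(robotYaw) + sy * std::cos(robotYaw),
        pz
    };
}
```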

The acquired 3D point clouds are evaluated using different algorithms. In this context, algorithms are available which recognize obstacles in the 3D images and reduce the gathered information about the size and orientation of the objects to a two-dimensional view. These data can then be fed into 2D SLAM algorithms, so that the considerably richer information contained in the 3D scans makes the 2D algorithms work more effectively. Integrating 3D perception makes it possible, for the first time, to navigate autonomously in an unstructured outdoor environment.
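
As an illustration of such a reduction, the sketch below keeps only the points within the height band relevant to the robot and projects them onto a 2D occupancy grid that could be handed to a 2D SLAM algorithm. The height band, grid size and resolution are assumed values, not parameters of our implementation.

```cpp
// Illustrative reduction of a 3D point cloud to a 2D obstacle view: keep only
// points in the height band the robot can collide with and mark the matching
// cells of an occupancy grid. All thresholds and grid parameters are assumed.
#include <cmath>
#include <vector>

struct Point3D { double x, y, z; };

std::vector<std::vector<bool>> projectObstacles(const std::vector<Point3D>& cloud,
                                                double minZ = 0.1, double maxZ = 2.0,
                                                double cellSize = 0.1, int gridDim = 400)
{
    // gridDim x gridDim grid centred on the robot, cellSize metres per cell.
    std::vector<std::vector<bool>> grid(gridDim, std::vector<bool>(gridDim, false));
    const double halfExtent = 0.5 * gridDim * cellSize;

    for (const auto& p : cloud) {
        if (p.z < minZ || p.z > maxZ) continue;   // ignore ground and overhangs
        int col = static_cast<int>(std::floor((p.x + halfExtent) / cellSize));
        int row = static_cast<int>(std::floor((p.y + halfExtent) / cellSize));
        if (col >= 0 && col < gridDim && row >= 0 && row < gridDim)
            grid[row][col] = true;                // cell contains an obstacle
    }
    return grid;
}
```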

Furthermore, we work on extracting features from the 3D point clouds so that the 3D data can be examined for characteristic properties, thereby reducing the large quantities of data generated by the 3D sensor and enabling more efficient algorithms for object detection and classification.
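
One common data-reduction step of this kind is voxel-grid downsampling, which replaces all points falling into the same cubic cell by a single representative. It is shown here purely as an illustration; the concrete feature-extraction algorithms are not described above.

```cpp
// Voxel-grid downsampling as an example of point-cloud data reduction:
// all points in the same cubic cell are replaced by one representative.
// Shown only as an illustration of the data-reduction idea.
#include <cmath>
#include <map>
#include <tuple>
#include <vector>

struct Point3D { double x, y, z; };

std::vector<Point3D> voxelDownsample(const std::vector<Point3D>& cloud, double voxelSize)
{
    // Keep the first point seen in each voxel (a centroid would also work).
    std::map<std::tuple<long, long, long>, Point3D> voxels;
    for (const auto& p : cloud) {
        auto key = std::make_tuple(static_cast<long>(std::floor(p.x / voxelSize)),
                                   static_cast<long>(std::floor(p.y / voxelSize)),
                                   static_cast<long>(std::floor(p.z / voxelSize)));
        voxels.emplace(key, p);   // inserts only if the voxel is still empty
    }

    std::vector<Point3D> reduced;
    reduced.reserve(voxels.size());
    for (const auto& [key, point] : voxels) reduced.push_back(point);
    return reduced;
}
```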

Distributed Realtime and Automation Systems

Realtime-Linux: Xenomai, RTnet, RACK

Our institute's research and development is based on the Linux real-time extension Xenomai. Further development of this open-source project is actively supported; in this context, RTS has contributed the Real-Time Driver Model (RTDM) as well as numerous drivers.
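
For readers unfamiliar with Xenomai, the following minimal sketch shows a periodic hard real-time loop using Xenomai's native/Alchemy task API. The task name, priority and 1 ms period are illustrative assumptions, and the sketch is a plain user-space task rather than an RTDM driver.

```cpp
// Minimal periodic hard real-time loop using Xenomai's Alchemy/native task API
// (Xenomai 3 headers shown; Xenomai 2 uses <native/task.h>). Task name,
// priority and the 1 ms period are illustrative assumptions.
#include <alchemy/task.h>
#include <alchemy/timer.h>
#include <unistd.h>

static RT_TASK controlTask;

static void controlLoop(void* /*arg*/)
{
    // Make this task periodic with a 1 ms period (given in nanoseconds).
    rt_task_set_periodic(nullptr, TM_NOW, 1000000);

    for (;;) {
        unsigned long overruns = 0;
        rt_task_wait_period(&overruns);   // sleep until the next period boundary
        // ... read sensors, compute the control output, write actuators ...
        (void)overruns;                   // a real application would log misses
    }
}

int main()
{
    // Create and start the task with priority 80 and the default stack size.
    rt_task_create(&controlTask, "control", 0, 80, 0);
    rt_task_start(&controlTask, &controlLoop, nullptr);
    pause();                              // keep the process alive
    return 0;
}
```

Such a program is typically built with the compiler and linker flags reported by the xeno-config helper shipped with Xenomai.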

The RTnet project originated at RTS from several student and diploma theses, with the aim of using standard Ethernet for communication under hard real-time conditions. Shortly after its publication, numerous volunteers from different countries joined the development team, so that progress continues at a dynamic pace. The open architecture of RTnet offers potential for diverse applications, ranging from the institute's mobile robots to industrial systems. RTS will continue to actively support the coordination and development of this project.

All RTS service robots run an in-house developed real-time middleware based on Xenomai and RTnet. At its core is the Robotics Application Construction Kit (RACK), which has been published as open source. Further components are developed in the course of research projects or on behalf of industry partners.

Scalable Processing Box

Increasingly complex robotic projects in research and teaching require the development of a standardized and scalable processing platform that can serve as a basis not only for lectures and lab courses but also for research projects. The Scalable Processing Box (SPB) is designed to offer students and researchers a flexible way to work in the area of robotics without having to worry about the basic processing infrastructure, which they only want to use as a tool for their work. The objective is to standardize the SPB with regard to hardware and software so that, despite the use of processor cores with different performance and architectures, the same look-and-feel is always achieved. Thus it should be possible to realize simple low-level projects for beginning students as well as complex applications for research projects (e.g. doctoral theses).