Robots: Humans' Dependable Helpers

Technologies in Robotic Systems

Augmented Reality

Augmented Reality for Visualization, Interaction and Simulation in Robotic Applications

The Fraunhofer IFF’s Robotic Systems Business Unit develops cross-project tools and software components for the creation of augmented reality applications. The advantages and basic functions of our own software architecture are applied in distributed robotic applications and the numerous software modules produced with it, including not only our own marker system but also modules for classic mobile robot functions such as localization and data fusion. They are also suitable for visualizations that require tracking users or a camera.

Live and Offline Visualization of Sensor Data in the Context of Real Environments

The system can be used to produce applications that represent and present equipment and systems, to support robot programming by end users, and for demonstration and training purposes. It supports developers in the creation of new, innovative robots and the programming of applications. Locally embedding sensor data from sources such as laser scanners, 3D sensors or cameras in a real scene facilitates rapid and intuitive diagnosis of problems such as calibration errors.
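Embedding scanner data in a camera view of the real scene amounts to transforming each measured point into the camera frame and projecting it onto the image with the camera's calibration. A minimal sketch of this step follows; the calibration values, frame offset and scan point are hypothetical placeholders, not values from the actual system:

```python
import math

def transform_point(p, rotation, translation):
    """Apply a 3x3 rotation matrix and a translation vector to a 3D point."""
    return tuple(
        sum(rotation[i][j] * p[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

def project_to_pixel(p_cam, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame point to pixel coordinates.
    Returns None for points behind the image plane."""
    x, y, z = p_cam
    if z <= 0:
        return None
    return (fx * x / z + cx, fy * y / z + cy)

# Hypothetical extrinsic calibration: the scanner frame coincides with the
# camera frame except for a 10 cm offset along the camera's x axis.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.10, 0.0, 0.0]
# Hypothetical intrinsics of a 640x480 camera.
fx = fy = 500.0
cx, cy = 320.0, 240.0

scan_point = (0.10, 0.0, 2.0)          # one laser return, scanner coordinates
p_cam = transform_point(scan_point, R, t)
pixel = project_to_pixel(p_cam, fx, fy, cx, cy)
print(pixel)  # (370.0, 240.0)
```

If overlaid points drift away from the object edges they should hug, the extrinsic transform or the intrinsics are off, which is exactly the kind of calibration error this visualization makes immediately visible.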

AR CAVE: An HRI Simulation Environment

The AR CAVE is a space equipped with sensors that scan objects and people in it three-dimensionally in real time. This installation efficiently simulates human-robot interaction situations for planning, analysis and demonstration without any real machine components endangering the individuals experimenting. To do so, real and computer-generated objects such as robots or machines are combined by continuously scanning real objects in 3D, inserting virtual robots and simulating the physical interaction between the two worlds.
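The core of such a simulated interaction is a proximity test between the continuously scanned real geometry and the virtual robot's geometry. A minimal sketch under assumed data, with the robot link approximated by a bounding sphere and the scan reduced to a handful of surface points (all coordinates here are invented for illustration):

```python
import math

def min_clearance(points, spheres):
    """Smallest distance from any scanned point to the surface of any sphere
    approximating a virtual robot link. Negative values mean penetration."""
    best = math.inf
    for p in points:
        for center, radius in spheres:
            best = min(best, math.dist(p, center) - radius)
    return best

# Hypothetical input: scanned surface points of a person's arm and one
# sphere approximating the virtual robot's wrist (meters, world frame).
scan = [(0.9, 0.0, 1.2), (1.1, 0.0, 1.2), (1.4, 0.0, 1.1)]
robot = [((1.2, 0.0, 1.2), 0.15)]

clearance = min_clearance(scan, robot)
if clearance < 0.05:  # 5 cm safety margin, an assumed threshold
    print("warning: virtual robot within safety margin of scanned object")
```

A real system would run this on dense point clouds against full link geometry per sensor frame, but the principle is the same: because the robot only exists virtually, a negative clearance triggers a simulated reaction instead of a real collision.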

Data is output visually with the aid of augmented reality using selectable stationary cameras and novel, compact AR glasses with integrated stereo cameras, which are worn by interacting users in the space.

The system can be varied to include real and virtual robots, real robot controllers in real environments and virtual robots in simulation environments, or simulated and real sensor systems. Use scenarios cover situation adjustment and analysis as well as training and education. The system is currently in development as one of the priorities of the ViERforES research project.