Located on our university campus, iMotion Germany GmbH is an advanced engineering think tank for iMotion in Suzhou, China. The Tier-1 supplier provides hardware and software for ADAS and autonomous driving. ASR collaborates with iMotion by supervising PhD candidates who are on the company's payroll.
OpenDS is one of the most popular open-source driving simulators worldwide. Originally developed to facilitate studies in the driver distraction domain, OpenDS has become a tool for investigating autonomous driving technology. The latest release is OpenDS 5.0, published in 2018. http://opends.dfki.de
Autonomous driving (highly or fully automated driving) is one of four application domains for which we develop AI technology. With regard to environment perception, we apply our Digital Reality principle, i.e., we develop AI that is capable of generating synthetic data, focusing on human behavior in urban traffic situations. For trajectory planning, we apply our hybrid learning technology, leading to more robust and trustworthy systems. Moreover, our work on the AI platform will be applied in the autonomous driving domain.
Industrie 4.0 will deliver new, digitally refined, intelligent products. Customers will be able to design their own products and have them produced at a reasonable cost. Products and production will become more versatile: products will be designed by globally distributed design teams and produced in smart factories.
The Autonomous Driving (AD) team conducts research on AI-based environment perception and trajectory planning for autonomous vehicles (Level 4 and above).
We consider both subsymbolic AI techniques used in learning systems (machine learning, deep learning) and symbolic techniques such as reasoning or constraint-based methods.
With respect to learning systems, we work with both real and synthetic data in an idealized implementation of the Digital Reality principle, which is the thematic guideline of the ASR research area.
The overall goal of REACT is a systematic, safe, and validatable approach to developing, training, and using digital reality so that autonomous systems act safely and reliably, especially in critical situations. To reach this goal, we use methods and concepts from machine learning, in particular deep learning and (deep) reinforcement learning (RL), to learn lower-dimensional submodels of the real world. From these submodels we (semi-)automatically compile complex, high-dimensional models in order to identify and simulate the entire range of critical situations. By means of digital reality, we virtually synthesize the otherwise missing sensor data of critical situations and train autonomous systems so that they can handle these situations safely and confidently. The aim of the project is to enhance the capabilities of autonomous systems. To this end, we continuously and systematically validate synthetic data against reality and adapt the models where necessary.
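The core loop described above can be illustrated with a deliberately simplified sketch: a hypothetical low-dimensional submodel synthesizes rare critical scenarios (here reduced to a single time-to-collision value), and a one-parameter driving policy is tuned on that synthetic data. All function names, distributions, and parameters below are illustrative assumptions, not part of REACT itself.

```python
import random

random.seed(42)

def synthesize_critical_scenarios(n, ttc_mean=1.5, ttc_sd=0.5):
    """Hypothetical low-dimensional submodel: sample time-to-collision
    (TTC, in seconds) for rare critical events such as a pedestrian
    suddenly crossing. Illustration only, not a validated model."""
    return [max(0.1, random.gauss(ttc_mean, ttc_sd)) for _ in range(n)]

def evaluate_policy(brake_threshold, scenarios, reaction_time=0.4):
    """Fraction of scenarios handled safely: the vehicle is safe if it
    starts braking (TTC at or below the threshold) while more time than
    its reaction time remains."""
    safe = sum(1 for ttc in scenarios
               if ttc <= brake_threshold and ttc > reaction_time)
    return safe / len(scenarios)

# Tune the single policy parameter on synthetic critical situations,
# standing in for training on synthesized sensor data.
scenarios = synthesize_critical_scenarios(1000)
best = max((t / 10 for t in range(5, 31)),
           key=lambda thr: evaluate_policy(thr, scenarios))
print(f"brake threshold: {best:.1f} s, "
      f"safe rate: {evaluate_policy(best, scenarios):.2f}")
```

In REACT the same validation step would compare such synthetic scenarios against real recordings and adapt the submodel where the two diverge; here that step is omitted for brevity.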
Since March 2016, the research department Agents and Simulated Reality, together with the Innovative Retail Lab and partners from other research institutions and from industry, has been developing intelligent added-value services in the field of building automation.
Fastlane solves these problems by combining the results of two successful projects, Xflow and shade.js, to provide a compiler-driven, adaptive data-flow programming framework for parallel data processing on the web.
A short video introducing the Guided Autonomous Building project results.