Upon completion of the project, an updated interface description will be published and incorporated into the ongoing ISO standardization process. For the first time, all automobile manufacturers and suppliers will be able to integrate their products quickly and easily into the fusion platform.
With the environment model, Hella Aglaia Mobile Vision has developed the central component of the OFP. Using the visualization options, developers can see how the vehicle perceives its entire surroundings and, on this basis, decide which sensor data should be fused. Whether complex driver assistance functions or fully automated driving functions, all of them can be programmed in this way.
Work on the OFP will continue after the project. A central question will be how the sensor data can be processed with machine learning in order to improve the functions and further accelerate the development work. The parking scenario is also to be expanded to include urban driving situations and speeds above 20 km/h. These scenarios require interaction with additional sensors, e.g. lidar sensors. It is precisely in this area of multisensor data fusion that the OFP can exploit its full potential. In addition, functional safety will play a major role in further development to ensure that all functions developed are fail-safe.
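To illustrate the idea behind multisensor data fusion in general terms (this is a minimal sketch, not OFP code; the `fuse` function and the sensor noise figures are hypothetical), independent distance estimates from camera, radar, and lidar can be combined with inverse-variance weighting, so that the more precise sensor dominates the fused result:

```python
# Minimal sketch of inverse-variance weighted sensor fusion.
# Not actual OFP code; sensor values and variances are illustrative.

def fuse(measurements):
    """Fuse independent (value, variance) estimates of one quantity.

    Returns the inverse-variance weighted mean and its variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return fused, 1.0 / total

# Hypothetical distance-to-obstacle readings in metres, with noise variance:
camera = (12.4, 0.50)
radar = (12.1, 0.10)
lidar = (12.2, 0.04)

estimate, variance = fuse([camera, radar, lidar])
```

The fused variance is smaller than that of any single sensor, which is the basic motivation for combining sensors rather than relying on the best one alone.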
The OFP was developed by the network coordinator Hella KGaA together with the German Aerospace Center DLR, Elektrobit, Infineon Technologies, InnoSent, Hella Aglaia Mobile Vision, Reutlingen University, RWTH Aachen University, Streetscooter Research and TWT.