Robot cars from different manufacturers will demonstrate their driving scenarios in the same arena at the RobustSense final event in Ulm, Germany, to which Marilyn will head along with six other robots on 16 May.
The latest additions to Marilyn include a new 1550-nanometer lidar and additional intelligence in its software, which improves the car's sensor capabilities. Software modules have also been built in for filtering point clouds and assessing scanner reliability. These ensure the vehicle keeps functioning in fog and powdery snow, conditions in which the new lidar, unlike sensors that 'see' only in the visible and near-infrared ranges of the spectrum, enables the robot car to detect people better. Although Marilyn's vision is limited to roughly 30 meters in thick fog, the new lidar type allows the car to keep driving slowly rather than coming to a full stop, as was the case with earlier versions, comments Project Manager Matti Kutila of VTT's RobotCar Crew team.
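To illustrate what a point-cloud filtering module might do in fog, here is a minimal, hypothetical sketch (not VTT's actual code): fog droplets produce sparse, scattered lidar returns, and a common way to suppress them is statistical outlier removal, where points whose mean distance to their k nearest neighbors is unusually large are treated as airborne scatter and dropped. The function name, parameters, and synthetic data below are illustrative assumptions.

```python
import numpy as np

def filter_fog_noise(points, k=8, std_ratio=1.5):
    """Return the subset of `points` (N x 3) that are not statistical outliers.

    A point is kept if its mean distance to its k nearest neighbors is
    within `std_ratio` standard deviations of the cloud-wide average.
    """
    # Pairwise distance matrix (fine for small demo clouds).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    # Mean distance to the k nearest neighbors, excluding the point itself.
    knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return points[knn_mean <= threshold]

# Demo: a dense cluster (a solid obstacle) plus a few far-away "fog" returns.
rng = np.random.default_rng(0)
obstacle = rng.normal(loc=[10.0, 0.0, 1.0], scale=0.2, size=(50, 3))
fog = np.array([[100.0, 100.0, 100.0],
                [-80.0, 60.0, 40.0],
                [50.0, -90.0, 10.0]])
cloud = np.vstack([obstacle, fog])
kept = filter_fog_noise(cloud)  # the 50 obstacle points survive
```

A production system would use spatial indexing (e.g. a k-d tree) instead of a full distance matrix, but the filtering principle is the same.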
The car also has traditional automotive radars and lidar, but their resolution and their detection of non-metallic obstacles are limited, particularly when trying to recognize shapes. By combining the results from the radar and lidar sensors, the vehicle can benefit from each sensor's advantages. "This makes the automated vehicle safer than a car driven by a person. Although there are still a lot of obstacles on the development path, a major leap has been taken in the right direction," Kutila says.
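The idea of combining the sensors' results can be sketched as a simple late-fusion rule. This is a hypothetical illustration, not the project's actual algorithm: the weights, threshold, and object names are assumptions chosen to show how radar (reliable for metallic targets and in fog) and lidar (better at shapes and non-metallic targets) can compensate for each other.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obj_id: str
    radar_conf: float  # 0..1, confidence from the radar track
    lidar_conf: float  # 0..1, confidence from the lidar cluster

def fused_confidence(det, fog=False):
    """Weighted combination of the two sensors' confidences.

    In fog the lidar is down-weighted because scattering degrades it.
    """
    w_lidar = 0.3 if fog else 0.6
    w_radar = 1.0 - w_lidar
    return w_radar * det.radar_conf + w_lidar * det.lidar_conf

def confirmed(detections, threshold=0.5, fog=False):
    """Return the ids of objects whose fused confidence clears the threshold."""
    return [d.obj_id for d in detections
            if fused_confidence(d, fog=fog) >= threshold]

scene = [
    Detection("pedestrian", radar_conf=0.2, lidar_conf=0.9),  # non-metallic
    Detection("car",        radar_conf=0.9, lidar_conf=0.8),  # metallic
    Detection("clutter",    radar_conf=0.1, lidar_conf=0.2),  # noise
]
```

In this toy scene, the pedestrian is confirmed mainly by the lidar and the car mainly by the radar, so neither sensor alone would have caught both: that is the complementarity the article describes.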
"Nevertheless, the researchers still have a long way to go on the journey towards 24/7 automated driving," Kutila concedes. "If we think of this as a 42 km marathon, we are now perhaps 10 km closer to our goal."
More and more scenarios that the robot cars can manage, such as city environments, main roads, snow and exit ramps, have been added. Marilyn will present her special capabilities at the RobustSense event, where she will drive through an artificially created bank of fog.