Know your occupants – know their needs
While everybody gets in, settings are adjusted to the driver’s preferences, and internet access via Bluetooth is prepared for the occupants’ tablets or smartphones (“Bring Your Own Device”, BYOD). During an initial period of manual driving, a Digital Companion supports the driver with navigation hints or warnings generated from an environment model, which in turn is based on sensor signals and external information (e.g. V2X). In this phase, the Infotainment/Interaction and Vehicle servers and the provision of software services (apps) need to interact seamlessly. Once on the motorway, the driver activates the Automated Driving (AD) function. If the kids in the rear seats get bored, the AI-based algorithms of the Digital Companion will recognize the type of occupant and accommodate their needs. Depending, for example, on the occupant’s age or emotional state, suitable entertainment suggestions or support are provided by an avatar on the rear-seat entertainment screen. This kind of support relieves the parents while driving and keeps the family relaxed. Plainly, everyone who has ever been on that kind of trip with their kids will know the situation and its potential implications…
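The occupant-aware entertainment logic described above can be pictured as a simple mapping from an occupant profile to a suggestion for the rear-seat avatar. The following is a minimal sketch under assumed inputs: the `Occupant` class, its attribute names, and the suggestion rules are illustrative, not a real in-vehicle API.

```python
from dataclasses import dataclass

@dataclass
class Occupant:
    seat: str             # e.g. "rear_left" (hypothetical identifier)
    age_group: str        # "child", "teen", "adult" (assumed categories)
    emotional_state: str  # e.g. "bored", "calm" (assumed detector output)

def suggest_entertainment(occupant: Occupant) -> str:
    """Pick a suggestion for the rear-seat avatar based on occupant type.

    The rules are placeholders for the AI-based classification the text
    describes; a real Digital Companion would use learned models.
    """
    if occupant.age_group == "child":
        return "interactive_game" if occupant.emotional_state == "bored" else "cartoon"
    if occupant.age_group == "teen":
        return "music_streaming"
    return "news_briefing"

# A bored child in the back triggers an engaging suggestion:
kid = Occupant(seat="rear_right", age_group="child", emotional_state="bored")
suggestion = suggest_entertainment(kid)
```

The point of the sketch is the data flow, not the rules: occupant sensing feeds a profile, and the Infotainment/Interaction server turns that profile into avatar behavior.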
Towards the end of the AD section, the driver needs to be brought back into the loop to take back control of the car. To optimize this important transition, the driver’s level of attention is detected through camera-based driver monitoring (FIG. 5). Other parameters, such as drowsiness detection or medical details, can also be included to ascertain that the driver is capable of taking back the driving task.
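Combining these monitoring signals into one take-over readiness decision might look like the sketch below. The threshold value and signal names are assumptions for illustration, not calibrated production values.

```python
def driver_ready(attention: float, drowsy: bool, medically_fit: bool = True) -> bool:
    """Decide whether the driver can safely take back control.

    attention:     normalized gaze-on-road score in [0, 1] from the
                   camera-based driver monitoring system (assumed scale).
    drowsy:        output of a drowsiness detector (blink rate, head pose, ...).
    medically_fit: placeholder for additional medical-detail checks.
    """
    ATTENTION_THRESHOLD = 0.7  # assumed calibration value
    return medically_fit and not drowsy and attention >= ATTENTION_THRESHOLD
```

In practice such a check would run continuously during the transition window, so the vehicle can escalate warnings or abort the handover early.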
Depending on the result of this interaction between the car and the driver, the transition process is adapted to ensure a smooth back delegation. If that fails for some reason (e.g. a medical emergency), a minimum risk maneuver will be carried out to bring the car to a safe state.
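The adaptive transition with its safe fallback can be sketched as a small decision loop: the car issues repeated take-over requests, re-checking driver readiness between attempts, and triggers a minimum risk maneuver if the driver never responds. The number of attempts and the state names are illustrative assumptions.

```python
def handover(ready_checks: list[bool], max_attempts: int = 3) -> str:
    """Walk through take-over requests; return the resulting vehicle state.

    ready_checks: successive readiness results from driver monitoring,
                  one per take-over request (hypothetical input format).
    """
    for driver_ready in ready_checks[:max_attempts]:
        if driver_ready:
            return "MANUAL_DRIVING"  # smooth back delegation to the driver
        # between attempts, a real system would escalate warnings
        # (visual -> acoustic -> haptic) to regain the driver's attention
    return "MINIMUM_RISK_MANEUVER"   # bring the car to a safe state
```

If the driver responds on any attempt, control is handed back; an empty or all-negative sequence (e.g. a medical emergency) ends in the minimum risk maneuver.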
Assuming that the back delegation went smoothly, the navigation system now provides turn-by-turn guidance to get the car to the vehicle exit zone of the amusement park. While the family leaves the vehicle and enters the park, the car parks and locks itself autonomously and may use the parking time to install OTA updates and prepare the return journey.
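Using parking time for software updates implies a scheduling decision: install only when the car will stand still long enough and has the energy to finish. A minimal sketch of such a rule, with assumed parameter names and an assumed battery margin:

```python
def schedule_ota(parked: bool, expected_idle_min: int,
                 battery_pct: int, update_duration_min: int) -> bool:
    """Decide whether to start an OTA update while the car is parked.

    expected_idle_min:   predicted parking duration (e.g. from the calendar
                         or learned habits) -- an assumed input.
    update_duration_min: worst-case install time for the pending update.
    """
    MIN_BATTERY_PCT = 30  # assumed safety margin so the car can still drive
    return (parked
            and battery_pct >= MIN_BATTERY_PCT
            and expected_idle_min >= update_duration_min)
```

A real implementation would also coordinate with the backend, verify signatures, and keep a rollback image, but the gating idea is the same.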
This scenario reveals the level of data exchange and connectivity within the car and between the car and the internet/the cloud. Every server in the car (AD with environment model, Body Electronics, Infotainment/Interaction with occupant model) needs to exchange data with the other servers to provide a smooth and safe ride. Permanent access to the cloud and to software services is channeled through a central gateway that provides access and security. This works well within an in-vehicle server-based architecture but causes problems in a decentralized architecture with many dedicated ECUs.
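Conceptually, the central gateway mediates every route between the in-vehicle servers and the cloud, forwarding only authorized traffic. The sketch below uses the server names from the text; the allow-list policy and function signature are assumptions for illustration.

```python
# (source, destination) pairs the gateway permits -- an assumed policy,
# not a real product configuration.
ALLOWED_ROUTES = {
    ("AD", "Infotainment"),        # e.g. share environment-model data
    ("BodyElectronics", "AD"),     # e.g. vehicle state for the AD function
    ("Infotainment", "Cloud"),     # app and software-service traffic
    ("Cloud", "Infotainment"),     # OTA packages, service responses
}

def route(source: str, destination: str, payload: bytes) -> bool:
    """Forward payload only if the (source, destination) pair is authorized."""
    if (source, destination) not in ALLOWED_ROUTES:
        return False  # gateway blocks unauthorized traffic
    # ... authenticated, possibly encrypted forwarding would happen here ...
    return True
```

The security benefit is exactly what the text argues: with one gateway, access control sits at a single choke point, whereas dozens of dedicated ECUs would each need their own hardened interface.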
Future cars will be Always On because drivers will expect to have information, apps and services at their fingertips wherever they go – including in the car. A connected vehicle that can synchronize with other smart devices will create a seamless experience of comfort and safety. In addition, future cars will have to meet extremely ambitious efficiency targets, produce next to no emissions, and avoid accidents on the way to Vision Zero. This mix of requirements necessitates a new approach to the E/E architecture, one that is seamlessly connected to the cloud. New functions in the vehicle – spread over practically all domains – will make use of in-vehicle networking and of the vehicle’s connection to the cloud to make services and apps accessible.