During quality inspection of production parts, the non-contact gesture-detection system gives the technician visual feedback via a monitor that displays a 3D reconstruction of the car part. The system was developed by researchers at the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation IOSB in Karlsruhe on behalf of the BMW Group.
“Previously, the inspector had to note all defects that were detected, leave his workstation, go to the PC terminal, operate multiple input screens and then label the position of the defect and the defect type. That approach is laborious, time-intensive and prone to error,” says Alexander Schick, a scientist at IOSB. The gesture control system, by contrast, considerably improves the inspector's working conditions and saves substantial time. “If the bumper is fine, then he swipes over it from left to right. In the event of damage, he points to the location of the defect,” says Schick.
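The two gestures described above (a left-to-right swipe to approve the part, a held pointing pose to flag a defect) can be illustrated with a toy rule-based classifier over tracked hand positions. The function name, thresholds and decision rules here are illustrative assumptions for the sketch, not IOSB's actual method:

```python
import numpy as np

def classify_gesture(hand_track, swipe_span=0.5, hold_radius=0.03):
    """Classify a sequence of 3D hand positions (metres, world frame).
    Illustrative sketch: a large left-to-right sweep means 'part OK',
    a hand held nearly still means 'pointing at a defect'."""
    pts = np.asarray(hand_track, float)
    if pts[-1, 0] - pts[0, 0] >= swipe_span:   # big sweep along the x axis
        return "part_ok"
    # hand held within a small sphere around its mean position
    if np.linalg.norm(pts - pts.mean(axis=0), axis=1).max() <= hold_radius:
        return "defect_pointed"
    return "unknown"
```

A real system would of course work on full skeletal data and be far more robust; the point is only that the two gestures are geometrically easy to separate.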
This non-contact gesture-detection system is based on 3D data. Hence, the entire workstation must first be reconstructed in 3D. That includes the individual as well as the object with which he is working. “What does the inspector look like? Where is he situated? How does he move? What is he doing? Where is the object? – all of these data are required so that the pointing gesture can be properly linked to the bumper,” explains the researcher. To enable gesture control, the experts apply 3D body tracking, which records the individual’s posture in real time.
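Linking a pointing gesture to a spot on the bumper is, at its core, a geometric problem: cast a ray along the pointing arm and find where it meets the part's 3D reconstruction. A minimal sketch, assuming tracked elbow and hand positions and the part represented as a point cloud (all names and the 5 cm tolerance are assumptions for illustration, not IOSB's implementation):

```python
import numpy as np

def pointing_target(elbow, hand, part_points, max_dist=0.05):
    """Cast a ray from elbow through hand and return the index of the
    nearest part point within max_dist of the ray, or None if no point
    on the part is close enough to count as 'pointed at'."""
    elbow, hand = np.asarray(elbow, float), np.asarray(hand, float)
    direction = hand - elbow
    direction /= np.linalg.norm(direction)
    pts = np.asarray(part_points, float)
    rel = pts - hand
    t = np.clip(rel @ direction, 0.0, None)   # only in front of the hand
    closest = hand + np.outer(t, direction)   # nearest ray point per part point
    dist = np.linalg.norm(pts - closest, axis=1)
    idx = int(np.argmin(dist))
    return idx if dist[idx] <= max_dist else None
```

This is why the full 3D reconstruction matters: both the body pose and the part must live in the same coordinate frame before the ray-to-surface test makes sense.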
The car body parts themselves are also “tracked.” For this, the hardware requirements are minimal: a standard PC and two Microsoft Kinect systems, each consisting of a camera and a 3D sensor, suffice to perform the reconstruction. Schick and his team developed the corresponding algorithms, which fuse multiple 2D and 3D images, specifically for this kind of application, and adapted them to the standards of the BMW Group.
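The fusion step rests on a standard idea: each Kinect delivers points in its own camera frame, and a calibrated rigid transform (rotation R, translation t) maps them into one shared world frame before merging. A minimal sketch of that merge, with illustrative names and calibration values (the article does not describe IOSB's actual fusion algorithm):

```python
import numpy as np

def fuse_point_clouds(clouds, extrinsics):
    """Transform each camera's point cloud into a common world frame
    and stack them into one cloud. `extrinsics[i]` is an (R, t) pair
    from an offline calibration of camera i."""
    fused = []
    for pts, (R, t) in zip(clouds, extrinsics):
        pts = np.asarray(pts, float)
        # apply the rigid transform: world = R @ p + t, row-wise
        fused.append(pts @ np.asarray(R, float).T + np.asarray(t, float))
    return np.vstack(fused)
```

Using two sensors this way reduces occlusions: what one camera cannot see behind the inspector or the part, the other usually can.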
“The breeding ground