Proximity Gesture Applications In Automotive HMI

In this article, Cypress Semiconductor looks at the use of capacitive proximity sensors in automotive applications. These sensors can be used for simple actions such as keyless entry, turning on cabin lights, or sensing an individual in the vehicle. Here, however, Cypress focuses on how the same principles can be used to provide inputs to systems within vehicles, such as media players and navigation systems.
By eeNews Europe


Capacitive proximity sensors are generally used to detect the presence of a user within proximity of the sensors. Upon detection, we can choose to make backlights glow to bring focus to a specific button, or bring a system out of low-power operation after having sensed the user's presence. In automotive applications specifically, capacitive proximity sensors are used to sense a user and turn on the cabin lights or activate the keyless door unlocking system. In addition to sensing the presence of a user near a single sensor, we can use multiple proximity sensors, placed suitably, to recognize simple hand gestures in the air. The data from all sensors can be combined to map the movement of a user's hand in the proximity area of the sensors. These gestures can be used as a way to provide inputs to a system, such as to control the media player, navigate a map, or browse a playlist.

We can place multiple proximity sensors in a suitable pattern, spatially apart from one another. As a hand moves across the sensors, the time instants at which it is detected by each of the sensors will differ. The relative order of detection and the time duration between detections by different sensors can be used to estimate the direction and pace of the hand's movement. A gesture can be as simple as drawing a straight line in the air by moving the hand from left to right over the sensors, or as complex as drawing a pattern such as a circle in the air. In this article, we will see how to implement simple gesture recognition and how more complicated gestures can be implemented using multiple sensors in different patterns.

Consider four capacitive proximity sensors arranged around the infotainment system of a car as shown in Figure 1.

Figure 1. Capacitive proximity sensors placed around the infotainment system (right) and the sensors with their positions labeled (left)

The placement of the sensors needs to be chosen such that the order in which the sensors are triggered differs depending on the gesture the hand makes over the sensor plane. We identify the order in which the sensors are triggered by the hand's movement; if the order matches any of the preset sequences, the corresponding gesture is issued. We will use the sensor placement pattern shown in Figure 1 as a reference for the gestures discussed in this article.

Consider a simple gesture in which the hand draws a straight line in the air by moving from left to right over the sensors, as shown in Figure 2(a). When the hand moves from left to right, the left sensor is triggered first, as soon as the hand approaches the system. Here the term 'triggered' means that the sensor has detected an object in its proximity; this should not be mistaken for enabling the proximity sensor. The proximity sensors are enabled as soon as the system is turned on, and they keep scanning for objects in their proximity.


Figure 2. (a) Left to right hand movement gesture drawing a straight line in air
(b) Plot of signal for each of the sensors as hand draws the straight line gesture

As the hand continues to pass over the console, the top and bottom sensors are triggered while the left sensor remains triggered. As the hand moves further towards the right sensor, the right sensor is triggered, and the left sensor stops sensing the hand, which has now moved outside its region of detection. As the hand passes over the right sensor, the top and bottom sensors no longer detect the hand's presence. When the hand moves further away, the right sensor too stops sensing the hand. The order in which the sensors are triggered will thus be one of the following, depending upon the position of the hand and the sensitivities of the individual sensors:

Left → top → bottom → right
Left → bottom → top → right
Left → bottom → right
Left → top → right

All of the above sensor activation sequences are mapped to the (left → right) gesture. A PSoC is used in this case to implement the capacitive proximity sensors. A capacitance-to-digital converter inside the PSoC, known as the CapSense Sigma Delta (CSD) module, is used to measure the capacitance. The output of the CSD module is referred to as rawcounts: the higher the rawcounts, the greater the capacitance sensed by the sensor. The presence of a hand close to a proximity sensor increases its capacitance.

When the rawcounts of a sensor cross a certain threshold above their base value, we say the sensor is triggered due to the presence of an object in its proximity. Figure 2(b) plots the rawcounts of the four sensors as a hand draws a straight line from left to right, as in Figure 2(a); the plot confirms the order of sensor activation mentioned above. If the hand moves in the opposite direction, that is, for a (right → left) gesture, the left and right sensors swap places in the activation sequences above. The sensor triggering sequence for a (right → left) gesture will therefore be one of the following:

Right → top → bottom → left
Right → bottom → top → left
Right → bottom → left
Right → top → left

The two gestures mentioned above involve movement of the hand in the horizontal direction. Similarly, if the hand draws a straight line in the vertical direction, it can be either a (top → bottom) gesture or a (bottom → top) gesture, depending on the direction of the hand's movement.

The (top → bottom) and (bottom → top) gestures can be associated with simple actions like scrolling up or down through a menu or a track list, as shown in Figure 3.

Figure 3. Proximity gesture of a hand drawing a straight line in the vertical direction to scroll through a menu

The (left → right) and (right → left) gestures can be associated with changing to the next track or album in a music player application. The same gestures can also be used instead of button presses to turn the interior lights of a car on or off, by placing proximity sensors as shown in Figure 4.

Figure 4. Hand drawing a straight line gesture to control the cabin lights of a car

A (top → bottom) gesture is similar to the action of an up/down button press. However, when an up/down button is held pressed, the screen keeps scrolling for as long as the button is held. In other words, the action 'sticks' as long as the button is pressed. To replace this button action completely with a gesture, the gesture needs to support this 'sticky' feature too, so we will modify it as described below. When the hand moves from the top sensor down towards the bottom sensor, the system decodes this as a (top → bottom) gesture as soon as the hand moves past the bottom sensor.

We can modify the gesture so that the scroll-down command is sent as soon as the hand reaches the last sensor in the gesture sequence, in this case the bottom sensor. Furthermore, the command is issued repeatedly as long as the hand remains over the bottom sensor. When the required menu item is reached, the hand moves further down and away from the bottom sensor, and the scroll-down command stops being issued. So, to make a 'sticky' gesture, instead of moving the hand away from the system in one go, we stop the hand over the last sensor, just before it would move out of that sensor's range. The command is issued for as long as the hand stays over the sensor.

A rawcounts plot of the top and bottom sensors for this sticky (top → bottom) gesture is shown in Figure 5. The bottom sensor stays triggered for some time after the top sensor stops sensing the hand, indicating that the hand has stopped at the bottom sensor instead of continuing straight down. To issue the sticky command, we check that the top sensor was triggered first, followed by the bottom sensor, and that the top sensor then no longer senses the hand while the bottom sensor still does. Once the hand has stayed near the bottom sensor for longer than a time threshold, the sticky command is issued for as long as the bottom sensor continues to sense the hand. Other gestures, too, can be modified in this way to have a 'sticky' feature, enabling gestures to replace the up/down button functions completely.

Figure 5. Signal plot for the top and bottom sensors for the sticky (top → bottom) gesture. The signal on the bottom sensor stays on longer, indicating a 'sticky' gesture

A more complex gesture is the circular loop. The hand can start over any of the sensors and then traverse a circular pattern, in either a clockwise or counterclockwise direction, over the other sensors. The loop is completed when the hand returns to the sensor on which it started; the hand then exits the circular loop gesture by moving away. For example, the hand can move over the right sensor and then clockwise over the bottom, left and top sensors, in that order, before exiting the loop over the right sensor again. Similarly, a counterclockwise loop can be completed by reversing the direction of movement. Multiple rotations can also be counted from the sensor excitation order.

The circle gesture is similar to the action of turning a knob. It can be associated with commands like volume up and down in the music player menu, or zoom in and zoom out when browsing maps.

In this article, we have discussed the detection of simple hand gestures using capacitive proximity sensors. Using the same principles, we can build more complex gestures, which may involve using both hands to draw a pattern in the air. Successful detection of such gestures, however, still depends on how good a sensor pattern we choose. It is important to choose a pattern that allows for tolerance in hand movements while drawing gestures, yet still produces a clear distinction in the order in which the sensors are triggered.

About the author:
Sivaguru Noopuran is a Senior Product Marketing Engineer for automotive products at Cypress Semiconductor Corp., with more than seven years of experience. His primary interests include user interface solutions and product management.
