Figure 3. Proximity gesture of a hand drawing a straight vertical line to scroll through a menu
The (left → right) and (right → left) gestures can be mapped to skipping to the next or previous track or album in a music player application. The same gestures can also replace button presses to turn the interior lights of a car on or off, with proximity sensors placed as shown in Figure 4.
Figure 4. Hand drawing a straight line gesture to control the cabin lights of a car
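The gesture-to-command mappings above can be expressed as a simple lookup from an ordered pair of triggered sensors to an action. The sketch below is illustrative only; the sensor names and command strings are assumptions, not part of the original design.

```python
def decode_swipe(sequence):
    """Map an ordered pair of triggered sensors to a command.

    `sequence` is the order in which the sensors detected the hand,
    e.g. ["left", "right"] for a left-to-right swipe.
    """
    gestures = {
        ("left", "right"): "NEXT_TRACK",   # skip to next track/album
        ("right", "left"): "PREV_TRACK",   # skip to previous track/album
    }
    return gestures.get(tuple(sequence), "UNKNOWN")

print(decode_swipe(["left", "right"]))  # NEXT_TRACK
```

The same table could instead map these pairs to `LIGHTS_ON`/`LIGHTS_OFF` for the cabin-light use case; only the command values change, not the decoding logic.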
A (top → bottom) gesture is similar to the up/down action of a button press. However, when an up/down button is held, the screen keeps scrolling for as long as the button remains pressed; in other words, the action 'sticks' while the button is held. To replace this button action completely, the gesture must support this 'sticky' feature as well, so we modify it as described below. Normally, when the hand moves from the top sensor down towards the bottom sensor, the system decodes a (top → bottom) gesture as soon as the hand moves past the bottom sensor.
We can modify the gesture so that the scroll-down command is sent as soon as the hand reaches the last sensor in the gesture sequence (in this case, the bottom sensor). The command is then issued repeatedly for as long as the hand remains over the bottom sensor. When the required menu item is reached, the hand moves further down and away from the bottom sensor, and the scroll-down command stops being issued. In short, a 'sticky' gesture repeats its final command while the hand dwells over the last sensor in the sequence.
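The sticky behavior above can be sketched as a small state machine: a (top → bottom) swipe arms the gesture, the scroll-down command repeats on every sample while the hand hovers over the bottom sensor, and scrolling stops once the hand moves away. This is a minimal simulation under assumed sensor sampling, not the original implementation.

```python
def sticky_scroll(samples):
    """Count scroll-down commands issued for a 'sticky' (top -> bottom) gesture.

    `samples` is a time-ordered list of (top_active, bottom_active)
    proximity readings, each 1 (hand detected) or 0 (no hand).
    """
    commands = 0
    seen_top = False
    for top, bottom in samples:
        if top:
            seen_top = True          # hand passed over the top sensor first
        elif seen_top and bottom:
            commands += 1            # repeat scroll-down while the hand hovers
        elif seen_top and commands > 0:
            break                    # hand moved away from the bottom sensor: stop
    return commands

# Hand sweeps top -> bottom, hovers over the bottom sensor for 3 samples,
# then moves away, ending the gesture.
print(sticky_scroll([(1, 0), (0, 1), (0, 1), (0, 1), (0, 0)]))  # 3
```

A real system would issue the repeated command on a timer tick rather than per sensor sample, but the arm/repeat/release structure is the same.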