The sensors must be placed so that different hand gestures over the sensor plane trigger the sensors in different orders. We identify the order in which the sensors are triggered by the hand's movement; if that order matches one of the preset sequences, the corresponding gesture is issued. We will use the sensor placement pattern shown in Figure 1 as a reference for explaining the gestures discussed in this article.
Consider a simple gesture in which the hand draws a straight line in the air by moving from left to right over the sensors, as shown in Figure 2(a). As the hand approaches the system moving from left to right, the left sensor is triggered first. Here the term 'triggered' means that the sensor has detected an object in its presence; this is not to be confused with enabling the proximity sensor. The proximity sensors are enabled as soon as the system is turned on, and they continuously scan for objects in their proximity.
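The scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not the article's implementation: the sensor names ("left", "right", "top", "bottom"), the gesture labels, and the sample format are all assumptions made for the example, matching the layout of Figure 1. Sensors that trigger at nearly the same time (such as top and bottom) are simply recorded in the order they are polled.

```python
# Illustrative sketch of sequence-based gesture detection (not the
# article's actual code). Sensor names and gesture labels are assumed.

# Preset trigger orders and the gestures they map to (hypothetical).
GESTURES = {
    ("left", "top", "bottom", "right"): "swipe-right",
    ("right", "top", "bottom", "left"): "swipe-left",
}

def first_trigger_order(samples):
    """Given (sensor_name, detected) readings in time order, return
    the sensors in the order each was first triggered."""
    order = []
    for name, detected in samples:
        if detected and name not in order:
            order.append(name)
    return tuple(order)

def classify(samples):
    """Match the observed trigger order against the preset sequences;
    return "unknown" if no preset sequence matches."""
    return GESTURES.get(first_trigger_order(samples), "unknown")
```

For the left-to-right gesture of Figure 2(a), readings such as `[("left", True), ("top", True), ("bottom", True), ("right", True)]` yield the trigger order `("left", "top", "bottom", "right")` and are classified as the assumed "swipe-right" gesture.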
Figure 2(b): Plot of the signal from each sensor as the hand draws the straight-line gesture.
As the hand continues to pass over the console, the top and bottom sensors are triggered while the left sensor remains triggered. As the hand moves further toward the right sensor, the right sensor is triggered, and the left sensor stops sensing the hand, which has now moved outside its region of detection. As the hand passes over the right sensor, the top and bottom sensors no longer detect the hand's presence. When the hand moves further away, the right sensor, too, stops sensing the hand. If we look at the order in which the sensors are triggered,