UW research could help users ‘train’ their smartphones

Mobile phones have become second nature for most people. What's coming next, say UW researchers, is the ability to interact with our devices not just through touchscreens but through gestures in the space around the phone. Some smartphones are starting to incorporate camera-based 3-D gesture sensing, for example, but cameras consume significant battery power and require a clear view of the user's hands.

UW engineers have developed a new form of low-power wireless technology that could soon contribute to this growing field by letting users "train" their smartphones to recognize and respond to specific hand gestures. Developed in the labs of Matt Reynolds and Shwetak Patel, UW associate professors of electrical engineering and of computer science and engineering, the technology uses the phone's own wireless transmissions to sense nearby gestures. It therefore works even when the device is out of sight in a pocket or bag, and it could easily be built into future smartphones and tablets.

“We have developed a new type of sensor that uses the reflection of the phone’s own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen,” says Reynolds.

The system uses multiple antennas to capture changes in the reflected signal and classifies those changes to identify which gesture was performed. In this way, tapping, hovering and sliding gestures could correspond to various commands, such as silencing a ring, changing which song is playing or muting the speakerphone.
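
To make the classification step concrete, here is a minimal sketch of how such a pipeline might look, assuming amplitude readings from several antennas and a simple nearest-centroid classifier trained on a few user-recorded examples. The feature choices, gesture labels and function names are illustrative assumptions, not the researchers' published design.

```python
import numpy as np

def extract_features(trace):
    """trace: (n_antennas, n_samples) array of reflected-signal amplitudes.
    Returns per-antenna mean and peak amplitude changes as one feature vector.
    (Illustrative features; the actual system's signal processing is not public.)"""
    deltas = np.diff(trace, axis=1)                       # sample-to-sample change
    return np.concatenate([deltas.mean(axis=1),           # average drift per antenna
                           np.abs(deltas).max(axis=1)])   # largest swing per antenna

def train(examples):
    """examples: {gesture_name: [trace, ...]} recorded by the user.
    'Training' here is just averaging feature vectors into one centroid per gesture."""
    return {name: np.mean([extract_features(t) for t in traces], axis=0)
            for name, traces in examples.items()}

def classify(trace, centroids):
    """Label a new trace with the gesture whose centroid is nearest in feature space."""
    feats = extract_features(trace)
    return min(centroids, key=lambda name: np.linalg.norm(feats - centroids[name]))

# Example with synthetic data standing in for real antenna readings:
# each gesture class gets a different signal variability so they are separable.
rng = np.random.default_rng(0)
examples = {g: [rng.normal(scale=i + 1, size=(4, 64)) for _ in range(5)]
            for i, g in enumerate(["tap", "hover", "slide"])}
centroids = train(examples)
print(classify(rng.normal(scale=3, size=(4, 64)), centroids))  # likely "slide"
```

In this sketch, "training" the phone amounts to recording a few example traces per gesture and averaging their features into a template; a real system would likely use a more robust classifier, but the flow from raw antenna readings to a gesture label is the same.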