
AllSee prototype puts gesture recognition in your pocket


March 2, 2014

The AllSee prototype developed at the University of Washington allows gestures to be detected while the phone is out of sight (Photo: University of Washington)


Current gesture recognition technology, as seen in devices such as Samsung's Galaxy S4, generally relies on the device's camera. This not only drains the device's battery, but also means users need to retrieve the phone from their pocket or handbag to make use of the technology. The new AllSee system developed at the University of Washington (UW) overcomes both problems by using wireless signals not only as a power source, but also as a way to detect user gestures while the phone is tucked away out of sight.

The AllSee system uses an ultra-low-power receiver powered by ambient electromagnetic waves from wireless transmissions, such as TV broadcasts, and detects changes in the amplitude of these waves caused by the user's movements. Different hand gestures affect the amplitude of the waves reflected off the human body in different ways, forming unique signatures that the system can detect and translate into specific commands.
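To illustrate the general idea, the sketch below shows a much-simplified, hypothetical version of signature-based gesture classification: each gesture is represented by a template amplitude trace, and an incoming amplitude envelope is matched to the nearest template. The templates, gesture names, and sample values are invented for illustration and are not AllSee's actual signatures or algorithm.

```python
import numpy as np

# Hypothetical amplitude-signature templates (invented for illustration).
# The premise, loosely following AllSee's approach: each hand gesture
# perturbs the amplitude of reflected wireless signals in a distinct way.
TEMPLATES = {
    "push": np.array([1.0, 0.8, 0.6, 0.4, 0.2]),   # amplitude falls as the hand closes in
    "pull": np.array([0.2, 0.4, 0.6, 0.8, 1.0]),   # amplitude rises as the hand recedes
    "flick": np.array([0.5, 1.0, 0.5, 1.0, 0.5]),  # oscillating amplitude
}

def classify(envelope):
    """Return the gesture whose template is closest (Euclidean distance)
    to the measured amplitude envelope."""
    env = np.asarray(envelope, dtype=float)
    return min(TEMPLATES, key=lambda name: np.linalg.norm(env - TEMPLATES[name]))

# A falling amplitude trace matches the "push" template most closely.
print(classify([0.95, 0.75, 0.55, 0.45, 0.25]))  # prints "push"
```

A real implementation would of course operate on noisy analog signals with far tighter power and timing budgets, but the template-matching intuition is the same.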

"This is the first gesture recognition system that can be implemented for less than a dollar and doesn’t require a battery," says Shyam Gollakota, a UW assistant professor of computer science and engineering. "You can leverage TV signals both as a source of power and as a source of gesture recognition."

The UW team has built an AllSee prototype and tested it by attaching it to the back of a smartphone. They claim the prototype correctly identified the intended hand gestures, performed more than 2 ft (60 cm) away from the sensor, over 90 percent of the time. These gestures include pushing and pulling to zoom in and out, and raising and lowering the hand to alter the volume of music playback. Additionally, the sensor's response time came in at under 80 microseconds, or 1,000 times faster than the blink of an eye.

The system's low power requirements mean the technology could remain always on without draining the mobile device's battery. To ensure unintentional movements don't confuse the system, the UW team designed a wake-up gesture.

As well as obvious applications in mobile devices, the UW researchers say the AllSee system could also find a place in household electronics, such as home monitoring systems.

"Beyond mobile devices, AllSee can enable interaction with Internet of Things devices," says Bryce Kellogg, a UW doctoral student in electrical engineering. "These sensing devices are increasingly smaller electronics that can’t operate with usual keypads, so gesture-based systems are ideal."

The UW team will showcase the AllSee system at the Symposium on Networked Systems Design and Implementation conference being held in Seattle in April.

The AllSee prototype can be seen in action in the following video.

Source: University of Washington

About the Author
Darren Quick. Darren's love of technology started in primary school with a Nintendo Game & Watch Donkey Kong (still functioning) and a Commodore VIC 20 computer (not still functioning). In high school he upgraded to a 286 PC, and he's been following Moore's law ever since. This love of technology continued through a number of university courses and crappy jobs until 2008, when his interests found a home at Gizmag.