The Computer Science & Engineering Department at the University of Washington has made a name for itself as an innovator and is further cementing that reputation with its latest development, AllSee, which allows users to control devices with hand gestures, even when the devices are out of sight.
Say you want to mute the song that’s playing on your smartphone. Simply flick your finger in the air and it’s done — even with the phone hidden away in your pocket. AllSee uses wireless signals to detect motion, then translates changes in those signals into specific commands. It also harvests existing TV signals as a power source, eliminating the need for a battery. Consisting of nothing more than a wire and a processor, it can be assembled for less than a dollar.
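The article doesn’t detail AllSee’s signal processing, but the core idea — a moving hand perturbs the amplitude of an ambient wireless signal, and the shape of that perturbation selects a command — can be sketched in a few lines. Everything below (function names, thresholds, the gesture-to-command mapping) is an illustrative assumption, not the team’s actual implementation:

```python
# Illustrative sketch only: AllSee's real classifier is not public in this
# article. The idea: hand motion changes the amplitude of an ambient RF
# signal, and the shape of that change is mapped to a command.

def moving_average(samples, window=3):
    """Smooth raw amplitude samples to suppress noise."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def classify_gesture(samples, threshold=0.2):
    """Map an amplitude trace to a command name (hypothetical mapping).

    A brief dip that recovers reads as a quick 'flick' (e.g. mute);
    a sustained rise reads as a hand approaching (e.g. volume up).
    """
    smoothed = moving_average(samples)
    baseline = smoothed[0]
    deltas = [s - baseline for s in smoothed]
    if min(deltas) < -threshold and abs(deltas[-1]) < threshold:
        return "mute"        # dip, then recovery: quick flick
    if max(deltas) > threshold and deltas[-1] > threshold:
        return "volume_up"   # sustained rise: hand staying close
    return "none"

# A flick briefly attenuates the signal, then the signal recovers.
trace = [1.0, 1.0, 0.6, 0.5, 0.7, 1.0, 1.0]
print(classify_gesture(trace))  # -> "mute"
```

The low power consumption the team cites comes from exactly this kind of simplicity: amplitude thresholding needs no camera and almost no computation.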
The entire project came together in less than a year. Vamsi Talla, who created the AllSee prototype with fellow graduate student Bryce Kellogg, says, “It has a lot of potential in mobile phones, especially issuing commands without a camera. Moving forward, the main reason AllSee is attractive is because of its low power consumption.” Existing gesture-recognition systems require a lot of power and computational resources.
AllSee is already garnering attention from the tech world. Assistant Professor Shyam Gollakota says the technology has many potential applications that attract would-be investors, but its main draw is gesture recognition that needs neither a camera nor a direct line of sight to operate. “Your phone can be in your pocket or in your bag and you can actually interact with it,” Gollakota says.
While the team hasn’t set its sights on the likes of Microsoft’s Kinect technology, it’s not hard to imagine AllSee evolving to rival the camera-dependent Kinect. Smartphone technology may be the first step, since the implications for consumer household electronics — controlling the thermostat and appliances, for instance — are enormous.
To watch a video about AllSee, please visit AllSee.cs.washington.edu.