The Google Team That Wants to Make Your Hand a Self-Contained Interface Control Using Radar
Google’s Project Soli is developing a new interaction sensor that uses radar technology. The sensor can track sub-millimeter motions at high speed and with high accuracy. It fits onto a chip, can be produced at scale, and can be built into small devices and everyday objects. The team explains:
“Our hands are fast and precise instruments, but so far, we haven’t been able to capture their sensitivity and accuracy in user interfaces. However, there’s a natural vocabulary of hand movements we’ve learned from using familiar tools like smartphones, and Project Soli aims to use these motions to control other devices. For example, your hand could become a virtual dial to control volume on a speaker, or a virtual touchpad to browse a map on a smartwatch screen.”
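To make the “virtual dial” idea concrete, here is a minimal sketch, not the Soli API (which had not yet been released), that accumulates per-frame finger-rotation estimates into a bounded volume level the way a physical knob would. The rotation deltas and the 270° full-scale sweep are assumptions standing in for whatever gesture estimates a sensor like Soli would provide.

```python
# Illustrative sketch only: maps hypothetical finger-rotation estimates to a
# volume level, the way a physical dial would. The per-frame rotation deltas
# below are simulated stand-ins for real sensor output.

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

class VirtualDial:
    """Accumulates rotation deltas into a bounded 0-100 volume setting."""

    DEGREES_PER_FULL_SCALE = 270.0  # assumption: a 270-degree twist spans the range

    def __init__(self, volume: float = 50.0):
        self.volume = volume

    def on_rotation(self, delta_deg: float) -> float:
        # Convert the incremental twist into a proportional volume change.
        self.volume += 100.0 * delta_deg / self.DEGREES_PER_FULL_SCALE
        self.volume = clamp(self.volume, 0.0, 100.0)
        return self.volume

dial = VirtualDial()
for delta in (15.0, 30.0, -10.0):  # simulated per-frame rotation estimates
    print(f"volume = {dial.on_rotation(delta):.1f}")
```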
To make our hands self-contained interface controls, the team needed a sensor that could capture sub-millimeter motions of overlapping fingers in 3D space. Radar fits these requirements, but the necessary equipment was just a little…big.
So the Project Soli team created a gesture radar small enough to fit in a wearable device. It is a new category of interaction sensor, running at 60 GHz, that can capture the motions of your fingers at resolutions and speeds that were not previously possible: up to 10,000 frames per second. To get there, the team had to reinterpret traditional radar, which bounces a signal off an object and provides a single return ping; recreating that approach at small scale would have been challenging from a hardware and computation perspective. So, to capture the complexity of hand movements at close range, Soli illuminates the whole hand with a broad radar beam and estimates the hand configuration by analyzing changes in the returned signal over time.
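To illustrate what “analyzing changes in the returned signal over time” can look like, here is a hedged sketch of classic range-Doppler processing, a standard radar technique (Google has not published Soli’s actual pipeline). Each frame of raw returns is Fourier-transformed along fast time to resolve range and along slow time to resolve velocity, and frame-to-frame changes in the resulting map reflect motion; the array shapes and the random test data are assumptions for illustration.

```python
# A minimal sketch of standard range-Doppler processing, assuming raw data
# arrives as a (num_pulses, samples_per_pulse) complex array per frame.
# This is a generic radar technique, not Google's published Soli pipeline.
import numpy as np

def range_doppler_map(frame: np.ndarray) -> np.ndarray:
    """frame: complex array of shape (num_pulses, samples_per_pulse)."""
    # FFT along fast time (within each pulse) resolves range...
    range_profile = np.fft.fft(frame, axis=1)
    # ...and FFT along slow time (across pulses) resolves Doppler (velocity).
    rd = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)
    return np.abs(rd)

def motion_energy(prev_map: np.ndarray, curr_map: np.ndarray) -> float:
    """Crude scalar summary of how much the scene changed between frames;
    a real system would feed richer temporal features to a gesture model."""
    return float(np.sum((curr_map - prev_map) ** 2))

# Simulated frames standing in for real sensor output.
rng = np.random.default_rng(0)
frames = [rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
          for _ in range(3)]
maps = [range_doppler_map(f) for f in frames]
for a, b in zip(maps, maps[1:]):
    print(f"frame-to-frame motion energy: {motion_energy(a, b):.1f}")
```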
The team built the first prototype, a 5 × 5 mm piece of silicon, in just 10 months. They are now finalizing the prototype development board and a software API for release to developers later this year.