Our hands are fast and precise instruments, but so far, we haven’t been able to capture their sensitivity and accuracy in user interfaces. However, there’s a natural vocabulary of hand movements we’ve learned from using familiar tools like smartphones, and Project Soli aims to use these motions to control other devices. For example, your hand could become a virtual dial to control volume on a speaker, or a virtual touchpad to browse a map on a smartwatch screen.

To make our hands self-contained interface controls, the team needed a sensor that could capture submillimeter motions of overlapping fingers in 3D space. Radar fits all these requirements, but the necessary equipment was just a little…big.

So the Project Soli team created a gesture radar small enough to fit in a wearable device. It’s a new category of interaction sensor, operating at 60 GHz, that can capture motions of your fingers at resolutions and speeds that haven’t been possible before—up to 10,000 frames per second. To get there, the team had to reinterpret traditional radar, which bounces a signal off an object and provides a single return ping. From a hardware and computation perspective, this would have been challenging to recreate at small scale. So to capture the complexity of hand movements at close range, Soli illuminates the whole hand with a broad radar beam and estimates the hand configuration by analyzing changes in the returned signal over time.
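Soli’s actual signal pipeline hasn’t been published in detail, but the general idea—recovering range and motion from changes in a returned radar signal—can be illustrated with a standard FMCW (chirped) radar sketch. Everything below is a hypothetical toy model, not Soli’s implementation: the bandwidth, chirp timing, and target parameters are assumptions chosen for illustration; only the 60 GHz carrier comes from the post.

```python
import numpy as np

# Toy FMCW radar sketch (illustrative only; not Soli's actual pipeline).
# Range comes from the beat-frequency FFT within one chirp ("fast time");
# motion shows up as phase change across successive chirps ("slow time").

c = 3e8                  # speed of light, m/s
f0 = 60e9                # 60 GHz carrier, as mentioned in the post
bandwidth = 2e9          # assumed chirp bandwidth (hypothetical)
chirp_time = 50e-6       # assumed chirp duration (hypothetical)
n_samples = 128          # samples per chirp (fast time)
n_chirps = 64            # chirps per frame (slow time)

slope = bandwidth / chirp_time      # chirp sweep rate, Hz/s
fs = n_samples / chirp_time         # fast-time sample rate

target_range = 0.30      # a hand 30 cm from the sensor (assumed)
target_speed = 0.50      # a finger moving 0.5 m/s toward it (assumed)

t = np.arange(n_samples) / fs
frame = np.zeros((n_chirps, n_samples), dtype=complex)
for k in range(n_chirps):
    r = target_range - target_speed * k * chirp_time  # range at chirp k
    f_beat = 2 * slope * r / c      # beat frequency is proportional to range
    phase = 4 * np.pi * f0 * r / c  # carrier phase tracks sub-mm motion
    frame[k] = np.exp(1j * (2 * np.pi * f_beat * t - phase))

# 2-D FFT: fast-time axis -> range bins, slow-time axis -> Doppler bins.
range_doppler = np.fft.fftshift(np.fft.fft2(frame), axes=0)

# The peak of the range-Doppler map locates the reflector.
dop_bin, rng_bin = np.unravel_index(np.argmax(np.abs(range_doppler)),
                                    range_doppler.shape)
range_res = c / (2 * bandwidth)     # range resolution set by bandwidth
est_range = rng_bin * range_res
print(f"estimated range = {est_range:.2f} m")
```

The key point the toy model shows is why a wide bandwidth and a high frame rate matter: range resolution is set by the chirp bandwidth, while fine finger motion appears as phase shifts across chirps, which is how sub-millimeter movement can be sensed even when the range bins themselves are centimeters wide.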

The team built the first prototype, a 5x5mm piece of silicon, in just 10 months. They’re working on finalizing a prototype development board and software API for release to developers later this year. Watch Ivan Poupyrev, Technical Project Lead, talk about the project at I/O here: https://www.youtube.com/watch?v=mpbWQbkl8_g&feature=youtu.be&t=10m15s

You can reach out to the team at projectsoli@google.com if you’d like to get involved. Watch the video to meet the team and learn more.  #ATAP #ProjectSoli 