Back in May 2015, Google had plenty to say about one of its most exciting new programs – a motion-tracking unit that didn’t rely on a camera, but on a sensor that emitted radio waves. Project Soli, as it was known, showed bags of potential applications, but news on the project soon dried up, leading many to assume Google had quietly shelved the tech for good.
However, it turns out Project Soli is far from shelved – in fact, the technology involved is set to make its way into many of Google’s future programs and creations. The news comes not from another official announcement, but from a series of job postings seeking software and hardware specialists to work on in-development projects that include Soli.
These posts include one for a Project Soli software architect, with the listing stating that the tech has “attracted significant world-wide attention and interest from commercial partners.” Google is also seeking a hardware engineer whose noted responsibilities include “integrating Soli sensor into development platforms and proof-of-concept products.”
So how does Soli work? The technology uses radio waves, much as radar does, but instead of tracking planes or ships it’s slimmed down to a small form factor that tracks the movements of the hand. Last year, Google showed off a prototype that could detect minute hand gestures – such as rubbing two fingers together or a tapping motion – which could effectively be used to control items in the home or operate wearables.
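Google hasn’t published Soli’s actual processing pipeline, but the general idea – turning a stream of radar motion readings into discrete gestures – can be sketched with a toy example. The sketch below is purely illustrative and assumes a hypothetical stream of per-frame motion-energy values; it distinguishes a brief “tap” from a sustained “finger rub” by how long the motion persists, which is a deliberate simplification of what a real radar gesture classifier would do.

```python
# Illustrative sketch only: Soli's real signal processing is not public.
# Toy classifier that labels a hypothetical stream of radar motion-energy
# samples as a brief "tap" or a sustained "rub" based on how many
# consecutive frames show motion above a threshold.

def classify_gesture(energy_samples, threshold=0.5, tap_max_frames=3):
    """Return 'tap', 'rub', or 'none' from per-frame motion energy."""
    longest = run = 0
    for energy in energy_samples:
        # Count the length of the current run of "active" frames.
        run = run + 1 if energy > threshold else 0
        longest = max(longest, run)
    if longest == 0:
        return "none"
    # Short bursts of motion read as taps; longer ones as finger rubs.
    return "tap" if longest <= tap_max_frames else "rub"

# A short burst of motion reads as a tap...
print(classify_gesture([0.1, 0.9, 0.8, 0.1]))               # tap
# ...while sustained motion reads as a finger rub.
print(classify_gesture([0.7, 0.9, 0.8, 0.9, 0.7, 0.8]))     # rub
```

A real system would of course work on raw radar returns (range-Doppler maps) and use learned models rather than a hand-tuned threshold, but the shape of the problem – continuous sensor frames in, discrete gesture labels out – is the same.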
Google has built a vast directory of gestures so Soli can pick up even the most subtle movements of the hand, creating a mind-boggling number of potential uses should we ever see it in a commercial setting.