To recall, Google briefly touched upon Project Soli at last year's I/O conference but had made no further announcements about the project since. At I/O 2016, the company showed off the new Soli chip, which incorporates the sensor and antenna array into an ultra-compact 8x10mm package.
During the keynote, Dan Kaufman, Director of Google's ATAP team, revealed some of the improvements the new Project Soli chip brings, including power consumption reduced by roughly 5 percent compared with the original chip. The new Soli chip also requires less computational power. These improvements finally make it usable in consumer-facing products, though the company has not revealed a launch timeline.
Project Soli is a new approach to touchless interaction: a sensing technology that uses miniature radar to detect gestures. Devices where Project Soli can be embedded include wearables, phones, computers, cars, and IoT devices. At I/O 2016, Google also showed new concept hardware made in collaboration with LG and Harman.
At I/O, Google showed a concept smartwatch, built from an LG Watch Urbane fitted with a Soli chip, that worked entirely through gestures and could track movements as small as the wave of a finger. Ivan Poupyrev, Technical Program Lead at Google's ATAP, previewed some virtual tool gestures: holding a hand at a distance from the smartwatch lets users scroll through messages, while pulling the hand close to the watch lets them interact with it directly. The watch essentially responds to the proximity of the hand. Google also showed a speaker from JBL by Harman featuring a Soli chip, which allowed hand gestures to control music playback.
"Soli is a purpose-built interaction sensor that uses radar for motion tracking of the human hand," explained Poupyrev. He claimed that the Soli's sensor can track sub-millimeter motion at high speeds with great accuracy.
"We're creating a ubiquitous gesture interaction language that will allow people to control devices with a simple, universal set of gestures," added Poupyrev.
"Imagine an invisible button between your thumb and index fingers - you can press it by tapping your fingers together. Or a Virtual Dial that you turn by rubbing thumb against index finger. Imagine grabbing and pulling a Virtual Slider in thin air. These are the kinds of interactions we are developing and imagining," details Google's Project Soli page.
Poupyrev said the Soli sensor technology works by emitting electromagnetic waves in a broad beam. Objects within the beam scatter this energy, reflecting a portion of it back towards the radar antenna.
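To give a feel for the physics behind this, here is a back-of-the-envelope sketch (not Soli's actual signal pipeline, whose details Google has not published) of the numbers involved in radar-based hand tracking, assuming the 60GHz millimeter-wave band that Soli has been reported to use:

```python
# Illustrative only: rough radar numbers for an assumed 60 GHz carrier.
C = 3.0e8  # speed of light in m/s (rounded)

def wavelength_m(carrier_hz):
    """Wavelength of the emitted wave; phase changes over fractions of
    this distance are what make sub-millimeter motion detectable."""
    return C / carrier_hz

def doppler_shift_hz(radial_velocity_mps, carrier_hz=60.0e9):
    """Frequency shift of the reflected wave for a hand moving toward
    (positive) or away from (negative) the antenna."""
    return 2.0 * radial_velocity_mps * carrier_hz / C

print(wavelength_m(60.0e9))    # 0.005 -> a 5 mm wavelength
print(doppler_shift_hz(0.5))   # 200.0 Hz for a hand moving at 0.5 m/s
```

A 5mm wavelength helps explain the sub-millimeter claim: even tiny finger movements shift the phase of the reflected wave by a measurable fraction of a cycle.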
At the Google I/O ATAP session, Kaufman also revealed that the company had shipped Soli Alpha Dev Kits to select developers last year, and confirmed that announcements about the next Dev Kit application process can be expected in fall 2016. For more technical details, you can watch the Google I/O session on ATAP here.
As for Project Jacquard, Google and Levi's unveiled the first consumer-facing product with the technology at I/O 2016 - the Levi's Commuter Trucker Jacket. It will be available in a limited beta this year, before launching to the general public next year.