Last Updated: 11th July, 2019
Project Soli, Remastered: Well, here's something straight out of science fiction: If a pair of recent rumors is to be believed, Google may be planning to include a futuristic radar chip in its next Pixel phone, the Pixel 4 – possibly to allow for a wild new kind of touch-free gesture control.
Take a step back and just let it sink in. Crazy stuff, right?
Let's be clear: the chip itself is very real. Google has been talking about it since 2015, in fact, as part of its Motorola-born Advanced Technology and Projects (ATAP) group. That's the same group that came up with Google's now-defunct modular smartphone system, Project Ara, as well as the also-abandoned Project Tango program that aimed to create a new kind of augmented reality platform.
That concept lines up with a separate XDA Developers analysis, which found code in the Android Q beta software pointing to OS-level support for a series of unreleased gestures – not the conventional Android Q screen-based gestures, mind you, but a whole new category of hands-in-the-air movements that require a special "aware sensor" in order to be recognized.
Oh, and earlier this year, Google got a waiver from the FCC that gave it permission to operate this same Soli sensor at higher power levels than present guidelines allow. In its decision, the FCC said the move would "serve the public interest by offering innovative device control capabilities" using touchless hand gesture technology.
Good golly, Miss Soli, there sure is a lot going on here. Let’s step back for a second and explore this whole thing a little more closely so we can get the full context of what’s actually up and what might be looming in the not-so-distant future.
Project Soli's Humble Beginnings
We have to start at the beginning: Google first took the wraps off Project Soli at its I/O developers' conference in May 2015. At the time, the concept seemed a bit far-fetched – like another one of those lab-based ideas that'd blow us away in a demo but never make its way into the real world.
In a video describing the process, the folks behind Project Soli outlined how the chip would use radar to monitor even the slightest hand movements – "micromotions" or "quivers" – and then use those motions to communicate with various kinds of virtual interfaces. The system, they explained, was designed to "extract specific gesture information" from the radar signal at a "high frame rate."
Translated into non-geek-speak, that means the chip can sense precisely how you’re moving your hand – making a twisting-like motion as if you were turning a volume knob up or down, for instance, or tapping your thumb and index finger together as if you were tapping a button – and then perform an action on your device that’s mapped to that specific movement.
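To make that idea concrete, here's a minimal sketch of how a recognized gesture label might be dispatched to a device action. This is purely illustrative: Google has not published a public Soli API, so the gesture names and the `Device` class below are hypothetical stand-ins for whatever the real recognition pipeline emits.

```python
# Illustrative sketch only: Google has not published a public Soli API,
# so these gesture labels and the Device class are hypothetical.

class Device:
    """Toy stand-in for a phone receiving gesture events."""
    def __init__(self):
        self.volume = 5
        self.track = 0

    def adjust_volume(self, delta):
        # Clamp volume to a 0-10 range.
        self.volume = max(0, min(10, self.volume + delta))

    def skip_track(self):
        self.track += 1

# Map each recognized micro-gesture to an action, mirroring the idea of
# twisting a virtual volume knob or "tapping" thumb and index finger.
GESTURE_ACTIONS = {
    "dial_twist_clockwise": lambda d: d.adjust_volume(+1),
    "dial_twist_counterclockwise": lambda d: d.adjust_volume(-1),
    "button_tap": lambda d: d.skip_track(),
}

def handle_gesture(device, gesture):
    """Dispatch a recognized gesture label to its mapped action."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action(device)

phone = Device()
handle_gesture(phone, "dial_twist_clockwise")
handle_gesture(phone, "button_tap")
print(phone.volume, phone.track)  # 6 1
```

The point of the mapping table is that the hard part – extracting the gesture from the radar signal – stays entirely separate from what the device then does with it.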
Seriously, take four minutes to watch this. Trust me: it'll blow your mind.
Project Soli is developing a new interaction sensor using radar technology. The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale and built into small devices and everyday objects.
That’s all cool, of course, in that Google I/O developer-session-demo sense – the sense that you assume it’s something you’ll never actually see or use in your regular-person life (or at least not anytime soon). When you add in the possibility of this technology showing up in this fall’s Pixel 4, though, it takes on a whole new meaning.
And there’s more to the Soli story yet.
The Soli Sensor Evolution
About a year after its debut, Project Soli showed up again – this time, in a session at the 2016 Google I/O conference. That year, the Soli team announced it had shrunk the chip down and optimized it to be small and efficient enough to run on conventional smartwatch hardware – a substantial step down from the supercomputer-level power that most radar technology traditionally requires.
“Once you get something running on a smartwatch, you can get it running wherever you want,” stated Soli bigwig Ivan Poupyrev.
Poupyrev and his associates went on to note that the Soli chip ran on Android software and already worked not only with watches but also with phones and home entertainment devices. What's more, the radar could sense gestures up to 15 meters – about 49 feet (!) – away. And critically, they pointed out that the idea was not to replace existing forms of interaction but to provide an additional option on top of the more mundane methods we all already knew.
"It provides a third interaction dimension that accentuates and enhances other modalities of interaction like those of touchscreen and speech input," said Poupyrev. "We don't fight with them. We work together."
The full presentation is pretty long, but there’s a demo halfway through that’s well worth watching. I’ve got it cued up for you here:
One bit that struck me was the gesture shown for holding your hand up to stop a speaker from playing. It's strangely reminiscent of – almost identical to, in fact – the gesture built into the recently announced Google Nest Hub Max smart display (gesundheit!):
KTLA Tech Reporter Rich DeMuro goes hands-on with the $229 Google Nest Hub Max smart display with facial recognition. It will go on sale in July.
The Nest Hub Max appears to be using a standard camera to identify the gesture, but the similarity in execution seems unlikely to be coincidental – particularly when you consider how the Soli team has continually talked about building a "universal set of gestures" you could use to "manage any device around you."
Soli’s Next Steps
So what else could Project Soli and its gesture-sensing radar technology actually do? From the looks of it, plenty. Research from the University of St Andrews in Scotland shows Soli performing tasks like counting playing cards, sensing compass orientation, and analyzing the patterns of blocks in Lego towers.
Official press release: https://news.st-andrews.ac.uk/archive/radar-sensing-transforming-the-way-we-interact-with-computers
"Exploring Tangible Interactions with Radar Sensing" (Project Solinteraction), Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. (IMWUT), Vol. 2, No. 4, Article 200, December 2018. https://dl.acm.org/citation.cfm?id=3287078
Based on RadarCat: https://www.youtube.com/watch?v=B6sn2vRJXJ4
From the paper: "In this paper, we explore two research questions with radar as a platform for sensing tangible interaction with the counting, ordering, identification of objects and tracking the orientation, movement and distance of these objects."
"The method of sensing remains quite consistent, (but) the major input is the extensive exploration of… the ordering, stacking, counting, movement, and orientation of various objects," the researchers told The Verge earlier this year. The chips could even be integrated into smart home systems to keep track of particular objects within a house and detect if anything about them ever changes.
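As a rough intuition for how that kind of object identification could work, here's a toy nearest-neighbor sketch: an unknown radar signature is labeled by whichever reference signature it sits closest to. The actual RadarCat/Solinteraction pipeline uses machine learning over raw radar frames, and the feature vectors and labels below are entirely made up for illustration.

```python
# Toy sketch of RadarCat-style object identification: label an unknown radar
# signature by its nearest neighbor among labeled reference signatures.
# The real system learns features from raw radar frames; these 4-element
# vectors and their labels are invented purely for illustration.
import math

REFERENCE_SIGNATURES = {
    "empty_table":     [0.1, 0.0, 0.2, 0.1],
    "one_lego_block":  [0.4, 0.3, 0.1, 0.2],
    "two_lego_blocks": [0.7, 0.6, 0.2, 0.3],
}

def distance(a, b):
    """Euclidean distance between two signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(signature):
    """Return the label of the closest reference signature."""
    return min(REFERENCE_SIGNATURES,
               key=lambda label: distance(signature, REFERENCE_SIGNATURES[label]))

reading = [0.65, 0.55, 0.25, 0.25]
print(classify(reading))  # two_lego_blocks
```

The same nearest-match idea extends naturally to the counting and stacking tasks the researchers describe: each configuration of objects produces a distinct radar signature to compare against.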
Project Soli And The Pixel 4
So putting everything into perspective, what can we expect if this Soli chip actually does make its way into the Pixel 4? The clues found within the Android Q software suggest there might be gestures for commands like silencing music or skipping tracks, but it's hard to imagine the effects of this technology ending there.
Looking at all of the demos and ideas proposed in the earlier Soli materials, there seems to be a myriad of options just waiting to be explored. And if this technology truly is ready to make its way into widely available hardware, it only makes sense that it'd find its way into more than just the Pixel phone.
Google's Soli team has talked over and over again about the technology working with phones, computers, speakers, wearables, and perhaps even cars. And guess what? In some capacity, Google has a hand in all of those areas – so it doesn't seem like much of a stretch to imagine Soli and its radar-detected gestures becoming a common thread across many of the company's future products, provided the Pixel 4 debut pans out as expected.
As for how useful it'd actually be, that's a question only time will answer. "Cool" and "practical" don't always go hand in hand, and plenty of eye-catching features end up feeling more gimmicky than useful out in the real world. But Soli's suggested possibilities certainly seem more valuable than anything offered by the clumsy and short-lived "air gesture" technologies we've seen on Android phones before.
Most significant, the nature of the chip and its radar technology means motions can be recognized through fabric – which suggests the gestures could work, at least in theory, even when the associated device is tucked away in a pocket or a purse. And "even though these controls are virtual," Google has said, the interactions "feel physical and responsive" – with feedback "generated by the haptic sensation of fingers touching each other."
From a broader view, what's particularly interesting is how this breakthrough could prove the strength of Google's still-relatively-young hardware philosophy. From the beginning, we've discussed how the Pixel was and is far more than the sum of its parts – and how that "holistic," end-to-end oversight of the entire user experience is exactly what Google gains from developing its own hardware.
If this Soli stuff does in fact show up in the Pixel 4 this fall – and if it proves to be as effective and practical in the real world as it appears in these demos – we may see the biggest indication yet of how that approach could ultimately pay off, not only for Google but also for us as humans who carry and rely on its products.
And beyond just software support, the Pixel phones could end up with the killer feature they need to set themselves apart from the rest of the smartphone pack – and to stand out from the squirming, tech-for-the-sake-of-tech forms of "differentiation" most of the industry is currently attempting.
Heck, it might almost be enough to offset the obnoxious nature of all the sales-driven, frequently user-hostile changes we've seen in smartphone hardware of late. Almost – and maybe.