If the leaks are correct, you will be able to control Google's next Pixel 4 phone without touching it or speaking to it.

The company is reportedly building its own "Project Soli" radar chip into its next flagship phone.

The technology lets users control virtual buttons with subtle hand and finger gestures, without ever touching the device. Google has received a special waiver from the Federal Communications Commission (FCC) to test it, because the chip emits a stronger electromagnetic signal than current regulations allow.

Project Soli's founder, Ivan Poupyrev, has been working on the technology at Google's Advanced Technology and Projects (ATAP) group since 2015, and it was first shown publicly at the company's annual developer conference in 2016.

According to Poupyrev, the technology can sense the smallest movements and works at any scale: up close on small devices such as smartwatches, or from across the room on larger devices such as speakers and televisions.

However, according to the 9to5Google site, the first application of this technology will be in the upcoming Pixel 4 and Pixel 4 XL phones.

The XDA Developers site backed up these rumors when it found code in the Android Q operating system revealing that Google is working on new gestures that depend on an "Aware" sensor. The site also says the sensor may make its debut in a Pixel phone.

Will Soli be useful?
The practical value of the technology is not yet clear. But waving your hands like a Jedi to make things happen without touching anything is thrilling, at least from the point of view of futurists and innovation enthusiasts.

On the other hand, your smartphone is usually in your hands anyway. Why rub your thumb and forefinger together to raise the volume or skip a track when a tap on the screen does the same job? And when your phone is charging on the table, isn't it easier to just say "Hey Google" and ask the voice assistant to turn up the volume?

That's why Tom's Guide argues that the technology will only add a layer of complexity to a set of touch conventions that are already complicated enough to keep most users from exploiting them fully.

So far, attempts to use hand gestures as a smartphone user interface have been unsuccessful. LG tried it on its G8, where it ended up as little more than a gimmick. The same is true of Samsung, which introduced its "Air Gestures" feature on the Galaxy S4 years ago.

But Google's implementation may be polished and useful enough to keep consumers interested beyond the first five minutes of novelty. Then again, it may turn out to be another version of Apple's 3D Touch, which likewise added a hidden layer of complexity that most people never used; reports suggest Apple will abandon it because most users simply do not care.