Google’s Project Soli Uses Hand Gestures
That’s right. Google wants to push past the touchscreen and bring in hand gestures, a natural part of how humans communicate.
The trend began about 12 years ago, when Steve Jobs introduced the iPhone and its multi-touch user interface (UI), emphasizing the power of controlling a device with our fingers. Now Google sees similar potential in hand gestures controlling our electronics.
Google’s UI project, called Project Soli, uses radar to control electronics with in-the-air hand gestures. Soli emerged from Google’s Advanced Technology and Projects (ATAP) group, and it aims to let those gestures manage smartphones, computers, wearable devices, and even cars.
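Conceptually, a system like Soli recognizes a micro-gesture from radar returns and then maps that gesture to a control action on the device. Below is a minimal sketch of that last mapping step. The gesture names and actions are entirely hypothetical illustrations; Soli’s actual API is not described in this article.

```python
# Hypothetical gesture-to-action dispatch for a radar gesture UI.
# All gesture labels and action names are illustrative, not Soli's real API.

GESTURE_ACTIONS = {
    "finger_rub": "turn_dial",      # thumb rubbing along the index finger
    "finger_tap": "press_button",   # thumb tapping a fingertip
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
}

def dispatch(gesture: str) -> str:
    """Map a recognized gesture label to a device action, or ignore it."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

Unrecognized gestures fall through to `"ignore"`, so stray hand movements would not trigger device actions.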
The Future of Soli
The project itself is not new; it was first announced in spring 2015. However, it made recent news when the FCC granted Google permission to operate Soli’s radar sensors at higher power levels than current US regulations allow. The FCC also permitted the use of Soli devices on airplanes.
Since ATAP tends to move aggressively from idea to product, can we expect to see Google’s Soli on the market soon? We will have to wait and see.