Google recently unveiled a tiny new chipset, codenamed Project Soli, which detects gestures and movements with radar. Infineon Technologies, which developed the hardware, is working with Google to use it in various Internet of Things (IoT) devices like smartwatches, fitness bands, and driver-assistance systems.
The Project Soli chipset. Source: Google.
During its annual I/O developer conference in San Francisco, Google showcased Soli's ability to track a hand's position and movements in space. Since it's powered by radar instead of cameras, the chip can detect movements through other objects, which means that it can be embedded inside various objects.
This is neat technology on its own, but Soli could have some fascinating applications when combined with Google's other projects.
During I/O, Project Soli founder Ivan Poupyrev demonstrated how the chipset could be used to control smartwatches without touching their surfaces.
This isn't really a new idea -- Deus Ex Technology recently unveiled the Aria clip for smartwatches, which lets users issue commands with finger gestures instead of touching the screen. Some smartphones also let users answer prompts and input basic commands without touching the screen. However, the Aria only detects movements of the wrist, and smartphones use proximity sensors to detect broad gestures.
With Project Soli, embedded chips can detect smaller and more precise movements near various objects. This means that it could be possible to click icons or play games on smartwatches, smartphones, and tablets without ever touching the screen.
Controlling a smartwatch with Soli. Photo: Google.
Augmented and virtual reality
Google experimented in augmented reality with Google Glass, and invested over half a billion dollars in Magic Leap, a developer of holographic augmented reality experiences. It also introduced Cardboard, a DIY virtual reality headset, to encourage the development of VR apps for smartphones.
However, Cardboard can't track walking movements through virtual space, and users can't see their own hands and fingers in the virtual environment. Facebook's Oculus VR users can experience movement with 360-degree treadmills, and can use Intel's RealSense depth-sensing cameras to see their own hands.
That's why Google introduced Project Tango at I/O 2015. Similar to Oculus' fusion of Rift and RealSense, Tango uses depth-sensing cameras to render real people and objects in a VR environment. For example, two users wearing Tango-equipped tablet headsets can "see" each other as avatars, walk through a virtualized version of a real room, and even touch objects and each other. With Project Soli, people in those digital environments might also be able to touch and manipulate virtual objects or controls.
Project Soli could also be integrated into car control systems, allowing drivers to interact with their vehicles with gestures. Simple finger gestures could replace turn signal, horn, and windshield wiper controls on the steering wheel. Windows could be opened and air conditioning could be adjusted with a simple gesture in the air. Google might eventually combine facial/fingerprint recognition with Android Auto, and drivers could simply snap their fingers to start the ignition.
Adding futuristic gesture controls to cars could appeal to automakers like BMW, Volvo, and Hyundai -- all of which have introduced apps that let drivers control their vehicles from their smartwatches. That complements the growth of Android Wear and Android Auto, and could pave the way for Google's driverless cars -- which can be summoned by smartphone -- to hit the roads.
BMW's smartwatch app. Source: BMW.
Why this matters to Google
These potential applications all sound interesting, but investors might wonder how they could bolster Google's core business of Internet advertising.
It's all about ecosystem growth on top of the IoT market. Cisco forecasts that the number of connected devices worldwide will double from 25 billion in 2015 to 50 billion in 2020. This means that data won't just come from PCs or mobile devices anymore -- it will come from wearables, augmented reality devices, cars, and everyday objects like coffee makers and refrigerators. That's why Google launched Brillo, its operating system for IoT devices, and why it introduced Project Soli and Project Jacquard (the latter is a "smart fabric" that can remotely control mobile devices).
If these projects help Google reach into smart homes, cars, clothes, and appliances, it will be able to gather more data to improve search results and targeted advertising. By doing so, Google can strengthen its core business while expanding beyond the search box.
The article Google, Inc.'s Project Soli: Mobile Tech, Virtual Reality, and Driverless Cars originally appeared on Fool.com.
Leo Sun owns shares of Apple and Facebook. The Motley Fool recommends Apple, Cisco Systems, Facebook, Google (A shares), Google (C shares), and Intel. The Motley Fool owns shares of Apple, Facebook, Google (A shares), and Google (C shares). Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.
Copyright 1995 - 2015 The Motley Fool, LLC. All rights reserved.