Look at the history of technology and you'll notice a pattern: every new hardware platform took off only once it got an interface custom-made for it, one that made the platform easy and convenient to use.
The industry currently at that nascent stage is wearables. Wearables haven't really taken off, and they aren't selling well. But wearables are still a young product category, and with the right software they could become far more popular.
Most wearables today rely on a small touchscreen for navigation, and therein lies the problem. Touchscreens were designed with larger devices in mind; when the entire screen can be covered by a single finger, interacting through that screen is simply inefficient.
Google seems to be working on a solution with Project Soli. Soli tracks tiny finger movements and translates them into touch input. This means you can scroll just by rubbing your fingers together, which is strangely intuitive.
Project Soli does all this without draining battery life. It works by emitting a broad radar beam and monitoring changes in the reflected signal to detect when your fingers perform a gesture.
It operates at 60 GHz and can track motion at up to 10,000 frames per second.
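To make the idea concrete, here is a deliberately simplified sketch in Python. This is not Google's actual Soli pipeline (which extracts rich radar features and feeds them to machine-learning classifiers); it's a toy with made-up numbers that shows the basic principle of "emit a signal, watch for changes": each radar frame is reduced to a single reflected-signal amplitude, and sharp frame-to-frame jumps are flagged as finger motion.

```python
# Illustrative toy, NOT Soli's real algorithm: detect motion from
# frame-to-frame changes in a simulated reflected-signal amplitude.

def detect_gesture(frames, threshold=0.5):
    """Return the indices of frames where the reflected signal
    changes sharply relative to the previous frame -- a crude
    stand-in for radar-based motion detection."""
    events = []
    for i in range(1, len(frames)):
        # A large jump between consecutive frames suggests the
        # fingers moved and changed the radar reflection.
        if abs(frames[i] - frames[i - 1]) > threshold:
            events.append(i)
    return events

# A hand at rest reflects a nearly constant signal; a finger rub
# makes the reflection fluctuate. (Values are invented.)
still = [1.0, 1.01, 0.99, 1.0]
rub = [1.6, 0.4, 1.7, 0.3]
print(detect_gesture(still + rub))  # -> [4, 5, 6, 7]: jumps start at the rub
```

The real system obviously does far more (distinguishing gesture types, rejecting noise, running on dedicated silicon), but the core loop is the same: broadcast, measure, compare.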
Currently, Soli is the size of an SD card, and Google is looking to shrink it further. Google also wants to integrate all the circuitry Soli needs so the sensor can operate independently, further improving compatibility.
Google expects Soli to find its way into wearables over the next few years, as it improves power consumption and other factors.
What do you think? Can this solve the wearable interface problem?