Google’s Project Soli comes to the smartwatch

Wearables have been stuck in a sort of limbo lately. They did come back from the dead with Pebble’s Kickstarter campaign in 2012, but even after a few years of iterations and platform launches from Google and Apple, the space has failed to gain any real traction among consumers.

One of the main reasons is that there has been no ‘golden moment’ for the smartwatch interface. The iPhone was a hit because it brought capacitive touchscreens to mobiles, and that seemed like a match made in heaven. Put those touchscreens on tablets and they’re still effective. But shrink one down to a smartwatch and you have a display only a bit larger than your thumb. At that size, a touchscreen is neither ideal nor effective, yet it is the interface of choice in most wearables today.

There hasn’t really been a revolutionary Goldilocks moment for interacting with smartwatches, and having one is essential if the industry is to resonate with consumers.

Google’s Project Soli took a step forward at I/O 2016, proposing a solution to the problem: radar. The team behind Soli says that putting a radar sensor in a smartwatch can shift the point of contact from the watch’s small screen to the area around it, allowing for much more flexible, and thus more effective, interactions.

One of the problems the team faced was shrinking down the technology needed to achieve this. At small sizes, detecting hands at all is difficult, let alone tracking individual fingers. But they managed to do it, and have even built a universal gesture system that will go wherever Soli goes, to keep interactions consistent and help users grow familiar with the technology.

From afar, you can make broader gestures, and as you get closer, you can do finer, more complicated gestures. The gestures themselves are rooted in the real world, to help you get familiar with a new technology. There’s nothing convoluted going on here.
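To make that idea concrete, here is a minimal sketch, in plain Python, of how an app could layer its gesture vocabulary by sensed hand distance. The gesture names, the distance threshold, and the function itself are hypothetical illustrations of the "broad gestures far away, fine gestures up close" concept, not part of any Soli API.

```python
# Hypothetical sketch of proximity-layered gesture vocabularies.
# All names and thresholds are illustrative assumptions, not Soli APIs.

BROAD_GESTURES = {"wave", "swipe"}                         # coarse, whole-hand motions
FINE_GESTURES = {"button_tap", "dial_turn", "slider_rub"}  # finger-scale micro-gestures

def available_gestures(hand_distance_cm: float) -> set[str]:
    """Return the gesture set an app might listen for at a given sensed range."""
    if hand_distance_cm > 15.0:
        # Far from the watch: only broad motions can be detected reliably.
        return BROAD_GESTURES
    # Close to the sensor: fine micro-gestures become trackable,
    # while the broad gestures stay available.
    return BROAD_GESTURES | FINE_GESTURES

if __name__ == "__main__":
    for distance in (40.0, 10.0):
        print(f"{distance:>5.1f} cm -> {sorted(available_gestures(distance))}")
```

The point of the sketch is simply that the interaction surface grows richer as your hand approaches the device, rather than forcing every command through a thumb-sized screen.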

The team behind Soli says that this kind of gesture interaction system is ultimately more fluid, and makes using wearables more bearable.

Right now, the concept has been realized, but only in a crude form. The team is working on shrinking the technology down further and making it more power-efficient. Bugs still need to be ironed out and further optimizations made. We may see this technology show up in smartwatches within the next few years.

What do you think?
