
Google ATAP imagines how Soli could be used on smart displays/tablets in the future [Video]

Soli radar today is found on the second-gen Nest Hub and the $129 Nest Thermostat after previously debuting on the Pixel 4. Google’s Advanced Technology & Projects (ATAP) group today showed off other potential ways Soli could be used by future products.

On the Pixel 4, Soli was used to speed up face unlock and gesture control. The latter is also used on the Nest Hub, while sleep tracking is the other big feature. The Nest Thermostat, meanwhile, leverages the radar technology to wake the screen on approach.

In a documentary series released today, ATAP restated its goal “to create ambient, socially intelligent devices that are controlled by the wave of a hand or turn of the head.”

As humans, we understand each other intuitively — without saying a single word. We pick up on social cues, subtle gestures that we innately understand and react to. What if computers understood us this way?

Google wants devices to understand what it calls the “social context” of an environment with Soli-recognized nonverbal behaviors like “approach and leave,” “turning toward/away,” and glance. This includes understanding “when someone is approaching or entering its personal space.” 
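Those nonverbal behaviors map naturally onto a small UI state machine. The following is a purely hypothetical sketch of that idea in Python; every name here is an illustrative assumption, not part of Google's actual Soli software:

```python
# Hypothetical sketch: mapping Soli-style "social context" signals
# (approach/leave, turn toward/away) to display states.
# All identifiers are invented for illustration.

from enum import Enum, auto

class Signal(Enum):
    APPROACH = auto()
    LEAVE = auto()
    TURN_TOWARD = auto()
    TURN_AWAY = auto()

class UIState(Enum):
    AMBIENT = auto()   # glanceable info, e.g., just the temperature
    ENGAGED = auto()   # expanded info, e.g., the full forecast
    FOCUSED = auto()   # detail view, e.g., full notification text

# Each (current state, signal) pair maps to the next state.
TRANSITIONS = {
    (UIState.AMBIENT, Signal.APPROACH):    UIState.ENGAGED,
    (UIState.ENGAGED, Signal.TURN_TOWARD): UIState.FOCUSED,
    (UIState.FOCUSED, Signal.TURN_AWAY):   UIState.ENGAGED,
    (UIState.ENGAGED, Signal.LEAVE):       UIState.AMBIENT,
    (UIState.FOCUSED, Signal.LEAVE):       UIState.AMBIENT,
}

def next_state(state: UIState, signal: Signal) -> UIState:
    # A signal with no defined transition leaves the UI unchanged.
    return TRANSITIONS.get((state, signal), state)
```

The point of the table-driven design is that the display never needs an explicit command: each sensed behavior simply nudges the UI between glanceable and detailed modes, which is exactly what the demos below show.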

In practice, the company showed off a square, wall-mounted smart display placed near the front door of a house. When nobody is around, it primarily shows the temperature, with the forecast also noted in smaller text. On approach, the forecast takes up the entire screen.

Another example is a slightly bigger square display that notes what tune is playing and when you get new notifications. When you turn your head toward the screen, the full text of the message alerts appears. The full music UI returns when you stop looking.

Lastly, there are two examples with a tablet/kitchen smart display. In one, an incoming video call is automatically answered when you walk up to the device. In the other, walking away from the tablet while a video is playing automatically pauses playback.

The device used to demo these interactions, which appears to use e-ink, is clearly a mockup, but the parallels to Assistant Smart Displays are obvious. As for how close this is to reality, the Nest Hub already uses ultrasound sensing to increase the size of UI elements (e.g., a timer) or skip to the homescreen on approach.



Author: Abner Li
Source: 9to5Google

