
Nest Hub Max testing aptly named ‘Blue Steel’ feature to trigger Assistant without hotwords

On Monday, Google announced and started rolling out a major visual overhaul of the Smart Display UI. That redesign leaked out early, and the same source is back with a new capability. The Nest Hub Max is testing a ‘Blue Steel’ feature that obviates the need for saying the “Hey Google” hotword.

Another YouTube video from Jan Boromeusz today shows a “Dogfood Features” section in Nest Hub Max settings that allows Googlers to preview in-development functionality.

The clip goes on to show “Blue Steel” in action, where commands can just be spoken without first saying the hotword. The Assistant dots appear in the top-left corner moments before each command is spoken.

However, it’s not clear whether the Hub Max’s camera or ultrasound sensing is being leveraged. Boromeusz just says that Assistant starts listening “after detecting our presence near the device.” 

That would point to the latter technology, where the Smart Display emits inaudible sound waves that bounce off objects and are picked back up by the device’s microphones. Ultrasound sensing is today used to surface more details and touch controls as you approach. For example, when audio is playing and you get close, the Now Playing screen will show play/pause buttons that are otherwise hidden to keep the focus on cover art.

This approach would also mean that other devices like the Nest Hub and possibly third-party Smart Displays get this capability down the road.

Another possibility is the camera on the Nest Hub Max, which today recognizes when users hold up their palm to pause playback. This theory, where the camera looks for your face before listening, is also backed up by the codename: Blue Steel is a reference to Zoolander, where it’s the name of the iconic look struck by Ben Stiller’s character.

This approach would be more precise than simply having users near the device. If you’re looking directly at the Smart Display, you’re more likely than not getting ready to ask a question. The person in this video is obviously close by, holding their phone to record, yet the Assistant dots do not appear instantaneously, suggesting that the trigger is something more than mere presence, presumably looking directly at the screen.



Author: Abner Li
Source: 9to5Google

