AI & Robotics News

Google’s Look to Speak taps gaze-tracking AI to help users with impairments communicate

Google today launched an experimental app for Android that leverages AI to make communication more accessible for people with speech and motor impairments. Called Look to Speak, it tracks eye movements so people can select prewritten, customizable phrases with their eyes and have them spoken aloud.

Approximately 18.5 million people in the U.S. have a speech, voice, or language impairment. Eye gaze-tracking devices can provide a semblance of independence, but they’re often not portable and tend to be expensive. The entry-level Tobii 4C eye tracker starts at $150, for instance.

To address this need, speech and language therapist Richard Cave began collaborating with a small group at Google to develop Look to Speak. The app, which is available for free and compatible with Android 9.0 and above, enables users to glance left, right, or up to select what they wish to say from a phrase list.
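The article doesn't detail how glances map to phrase choices, but a left/right/up scheme like the one described is commonly implemented by repeatedly halving the candidate list: each sideways glance keeps one half until a single phrase remains, while glancing up cancels. Below is a minimal illustrative sketch of that idea; all function and variable names are hypothetical and are not from Google's app.

```python
def select_phrase(phrases, glances):
    """Narrow a phrase list with successive 'left'/'right' glances.

    phrases: ordered list of prewritten phrases
    glances: iterable of gaze events ('left', 'right', or 'up' to cancel)
    Returns the chosen phrase, or None if cancelled or exhausted.
    """
    candidates = list(phrases)
    for glance in glances:
        if glance == "up":            # glancing up cancels the selection
            return None
        mid = len(candidates) // 2
        if glance == "left":          # keep the left half of the list
            candidates = candidates[:mid]
        elif glance == "right":       # keep the right half
            candidates = candidates[mid:]
        if len(candidates) == 1:      # one phrase left: it gets spoken
            return candidates[0]
    return None

# Example: with four phrases, two glances pin down one of them.
phrases = ["Yes", "No", "Thank you", "I need help"]
print(select_phrase(phrases, ["right", "left"]))  # prints: Thank you
```

With n phrases, this scheme needs only about log2(n) glances per selection, which is why binary partitioning is a natural fit for a two-gesture input channel.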


Above: Google’s Look to Speak app uses front-facing cameras and gaze-tracking AI to recognize intent.

Image Credit: Google

With Look to Speak, people can personalize the words and sentences on their list and adjust eye gaze sensitivity settings. Google says the app’s data remains private and never leaves the phone on which it’s installed.

Look to Speak is part of Google's Start with One initiative on Experiments with Google, a platform for AI projects and experiments created by Google engineers and other developers. But the company's work on AI for accessibility extends beyond one ecosystem.

In early October, Google announced it would bring Google Assistant to Tobii Dynavox’s apps and services, enabling users with accessibility needs to assign virtual tiles to actions that control smart home devices and appliances. The companies also partnered to integrate Action Blocks, UI elements that make it easier for people with cognitive disabilities to use Google Assistant, with Tobii’s thousands of picture-based symbols for making calls, sending texts, playing video, and more.

In November, Google unveiled Project Guideline, which uses AI to help low-vision users navigate running courses. And Google is continuing to develop Lookout, an accessibility-focused app that can identify packaged foods using computer vision, scan documents to make it easier to review letters and mail, and more. There’s also Project Euphonia, which aims to help people with speech impairments communicate more easily; Live Relay, which uses on-device speech recognition and text-to-speech to let phones listen and speak on a person’s behalf; and Project Diva, which helps people give the Google Assistant commands without using their voice.


Author: Kyle Wiggers
Source: VentureBeat
