
Google’s Project Guideline uses AI to help low-vision users navigate running courses

In collaboration with nonprofit organization Guiding Eyes for the Blind, Google today piloted an AI system called Project Guideline, designed to help blind and low-vision people run races independently with just a smartphone. Using an app that tracked the virtual race via GPS and a Google-designed harness that delivered audio prompts to indicate the location of a prepainted line, Guiding Eyes for the Blind CEO Thomas Panek attempted to run New York Road Runners’ Virtual Run for Thanks 5K in Central Park.

According to the U.S. Centers for Disease Control and Prevention, in 2015, a total of 1.02 million people in the U.S. were blind and approximately 3.22 million people had vision impairment. Technologies exist to help blind and low-vision people navigate challenging everyday environments, but those who wish to run must rely on either a guide animal or a human guide who's tethered to them.

Google's Guideline app works without an internet connection and requires only a guideline painted on a pedestrian path. Users wear an Android phone around the waist using the aforementioned harness; the Guideline app runs a machine learning model that detects the painted line. (The model, which emerged from a Google hackathon, accounts for variables in weather and lighting conditions.) The app then approximates the user's position and delivers audio feedback via bone-conducting headphones to help keep them on the guideline. If the user drifts to the left of the line, the audio in their left ear increases in volume and dissonance; if they drift to the right, the same happens in their right ear.


“Imagine walking down a hallway in the dark with your arms outstretched. As you drift to the left, you will feel the wall with your left hand and move back to center to correct,” a Google spokesperson told VentureBeat via email. “As you drift to the right, you will feel the wall with your right hand and move back to center to correct. The same applies with Project Guideline, only you hear the boundaries to your left and right, rather than feel them.”
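To make that feedback loop concrete, here is a minimal, hypothetical sketch of how a lateral offset from the line could be mapped to an audio cue. The function and parameter names (feedback_for_offset, dead_zone_m, max_offset_m) and the thresholds are illustrative assumptions, not details of Google's implementation.

```python
# Hypothetical sketch of the left/right audio feedback described above.
# Names and thresholds are assumptions for illustration only; they do not
# reflect Google's actual Project Guideline code.

from dataclasses import dataclass


@dataclass
class AudioCue:
    ear: str           # "left", "right", or "none" when centered on the line
    volume: float      # 0.0 (silent) to 1.0 (full volume)
    dissonance: float  # 0.0 (pure tone) to 1.0 (harsh tone)


def feedback_for_offset(offset_m: float, dead_zone_m: float = 0.1,
                        max_offset_m: float = 1.0) -> AudioCue:
    """Map the runner's lateral offset from the painted line
    (negative = left, positive = right, in meters) to an audio cue
    in the ear on the side they are drifting toward."""
    magnitude = abs(offset_m)
    if magnitude <= dead_zone_m:
        # Close enough to the line: no corrective cue.
        return AudioCue(ear="none", volume=0.0, dissonance=0.0)

    # Scale intensity linearly between the dead zone and the maximum offset.
    intensity = min((magnitude - dead_zone_m) / (max_offset_m - dead_zone_m), 1.0)
    ear = "left" if offset_m < 0 else "right"
    return AudioCue(ear=ear, volume=intensity, dissonance=intensity)


# Example: a runner drifting 0.5 m to the left hears a moderately loud,
# moderately dissonant tone in their left ear.
print(feedback_for_offset(-0.5))
```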

Beyond the pilot with Panek, Google plans to partner with organizations to help paint guidelines in different communities and provide additional feedback.

Above: Images used to train the Guideline model. (Image Credit: Google)

The launch of Guideline comes after Google debuted more in-depth spoken directions for Maps, which inform users when to turn and tell them when they’re approaching an intersection so they can exercise caution when crossing. The company also continues to develop Lookout, an accessibility-focused app that can identify packaged foods using computer vision, scan documents to make it easier to review letters and mail, and more.


Author: Kyle Wiggers
Source: VentureBeat
