Google’s Action Blocks are shortcuts for building Google Assistant commands

Touchscreens and swipe gestures pose a challenge for the roughly 630 million people in the world with a cognitive disability. Executing these gestures requires coordinated motor movements that some people with disabilities can't reliably make. And it usually involves recalling a piece of information (like an address or a phone number), a step that isn't easy for those with severely impaired memory.

Making mobile apps more accessible was the motivating force behind Action Blocks, a Google framework that uses the company's intelligent assistant, Google Assistant, to kick off actions with the tap of a home screen shortcut. The shortcut can be customized with an image of the user's choosing, which serves as a visual cue.

“Think about the last time you did something seemingly simple on your phone, like booking a ride-share. To do this, you had to unlock your phone, find the right app, and type in your pickup location,” wrote Google accessibility software engineer Ajit Narayanan in a blog post. “We’ve been experimenting with how the Assistant and Android can work together to reduce the complexity of these tasks for people with cognitive disabilities.”


Action Blocks can be linked to any Assistant action, like ordering a ride-share. In fact, they can be configured to do anything the Assistant can do, including (but not limited to) queuing up a show, controlling connected lights, or calling a family member.

In that respect, they're akin to Apple's Siri Shortcuts, an iOS feature that lets users quickly perform preprogrammed tasks with a tap or by asking Siri. But unlike Siri Shortcuts, Action Blocks aren't widely available just yet; they're in testing as part of Google's Accessibility Trusted Tester program, and interested developers can sign up from the relevant webpage.
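
Google hasn't published a public Action Blocks API, but the basic mechanic it describes (a tappable, user-chosen icon on the home screen that hands a task off to the Assistant) can be roughly approximated with standard Android pinned shortcuts. The Kotlin sketch below is an illustration of that idea only, not Google's implementation: it pins a one-tap shortcut that fires the system voice-assistant intent, and the shortcut ID, label, and `R.drawable.ic_rideshare` icon resource are hypothetical placeholders.

```kotlin
import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat
import androidx.core.graphics.drawable.IconCompat

// Illustrative only: approximates the Action Blocks flow with a standard
// Android pinned shortcut. "ride_share_block", "Ride home", and
// R.drawable.ic_rideshare are hypothetical placeholders.
fun pinAssistantShortcut(context: Context) {
    // Tapping the shortcut launches the device's voice assistant.
    val assistIntent = Intent(Intent.ACTION_VOICE_COMMAND).apply {
        addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    }

    val shortcut = ShortcutInfoCompat.Builder(context, "ride_share_block")
        .setShortLabel("Ride home")
        .setIcon(IconCompat.createWithResource(context, R.drawable.ic_rideshare))
        .setIntent(assistIntent)
        .build()

    // Launchers that support pinning (API 26+, plus some older ones via
    // the compat library) will prompt the user to place the icon.
    if (ShortcutManagerCompat.isRequestPinShortcutSupported(context)) {
        ShortcutManagerCompat.requestPinShortcut(context, shortcut, null)
    }
}
```

An actual Action Block presumably embeds the Assistant query itself (say, "book a ride home"); the sketch above only gets the user to the Assistant in a single tap.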

“Action Blocks is the first of our many efforts to empower people with cognitive disabilities, help them gain independence, connect with loved ones and engage in the world as they are,” wrote Narayanan.

The launch of Action Blocks comes after Google detailed Parrotron, an ongoing research initiative that aims to help those with atypical speech become better understood. That reveal followed on the heels of three separate accessibility efforts at Google’s I/O 2019 developer conference: Project Euphonia, which aims to help people with speech impairments; Live Relay, which is designed to assist deaf users; and Project Diva, which gives people some independence and autonomy via Google Assistant.

At I/O, Google pointed to a few metrics from the World Health Organization to back its efforts: over 1 billion people, or 15% of the world's population, live with some form of disability.


Author: Kyle Wiggers
Source: VentureBeat
