
iOS engineers detail Apple’s approach to improving accessibility with iOS 14

Apple just gave its accessibility landing page an overhaul to better highlight the native features in macOS and iOS that allow users’ devices to “work the way you do” and encourage everyone to “make something wonderful.” Now a new interview with Apple’s accessibility and AI/ML engineers goes into more detail on the company’s approach to improving accessibility with iOS 14.

iOS accessibility engineer Chris Fleizach and AI/ML team member Jeff Bigham spoke with TechCrunch about how Apple approached evolving its accessibility features from iOS 13 to iOS 14 and the cross-team collaboration needed to achieve those goals.

One of the biggest accessibility improvements arriving with iOS 14 this fall is the new Screen Recognition feature. It goes beyond standard VoiceOver support by using “on-device intelligence to recognize elements on your screen to improve VoiceOver support for app and web experiences.”

Here’s how Apple describes Screen Recognition:

Screen Recognition automatically detects interface controls to aid in navigating apps
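
For context, VoiceOver normally relies on accessibility metadata that app developers attach to each control; Screen Recognition’s pixel-level detection matters most when that metadata is missing or incomplete. Here’s a minimal Swift sketch of the standard UIKit annotations an app would otherwise supply by hand (the control, label text, and hint are purely illustrative):

```swift
import UIKit

final class PlayerViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(playButton)

        // Metadata that VoiceOver normally reads is supplied by the developer.
        // Screen Recognition infers equivalent information from on-screen pixels
        // when an app or custom-drawn control doesn't provide it.
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"            // spoken name (illustrative)
        playButton.accessibilityTraits = .button          // role VoiceOver announces
        playButton.accessibilityHint = "Starts playback"  // optional extra context
    }
}
```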

A separate iOS 14 feature, Sound Recognition, uses “on-device intelligence to detect and identify important sounds such as alarms, and alerts you to them using notifications.”

Here’s how Fleizach describes the company’s approach to improving accessibility with iOS 14, and the speed and precision that Screen Recognition delivers:

“We looked for areas where we can make inroads on accessibility, like image descriptions,” said Fleizach. “In iOS 13 we labeled icons automatically – Screen Recognition takes it another step forward. We can look at the pixels on screen and identify the hierarchy of objects you can interact with, and all of this happens on device within tenths of a second.”
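
Fleizach’s description suggests the output is essentially a tree of on-screen elements with inferred roles and positions. Apple hasn’t published this data model, so the following Swift sketch is purely hypothetical, just to make the idea of a detected hierarchy concrete:

```swift
import CoreGraphics

// Hypothetical types: Apple has not documented Screen Recognition's internals.
enum DetectedRole {
    case container, button, text, image, slider, toggle
}

struct DetectedElement {
    let role: DetectedRole         // what kind of control the model thinks this is
    let frame: CGRect              // where it sits on screen
    let confidence: Double         // how sure the model is
    var children: [DetectedElement] = []
}

// A toy hierarchy for a media-player screen: a root container holding a title and a play button.
let screen = DetectedElement(
    role: .container,
    frame: CGRect(x: 0, y: 0, width: 390, height: 844),
    confidence: 0.99,
    children: [
        DetectedElement(role: .text,   frame: CGRect(x: 20, y: 80, width: 350, height: 40), confidence: 0.94),
        DetectedElement(role: .button, frame: CGRect(x: 165, y: 700, width: 60, height: 60), confidence: 0.97)
    ]
)
```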

Bigham notes how crucial collaboration across teams at Apple was in going beyond VoiceOver’s capabilities with Screen Recognition:

“VoiceOver has been the standard bearer for vision accessibility for so long. If you look at the steps in development for Screen Recognition, it was grounded in collaboration across teams — Accessibility throughout, our partners in data collection and annotation, AI/ML, and, of course, design. We did this to make sure that our machine learning development continued to push toward an excellent user experience,” said Bigham.

And that work was labor-intensive:

It was done by taking thousands of screenshots of popular apps and games, then manually labeling them as one of several standard UI elements. This labeled data was fed to the machine learning system, which soon became proficient at picking out those same elements on its own.
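
That workflow is a textbook supervised object-detection setup: capture screenshots, draw bounding boxes around controls, tag each box with one of a fixed set of element classes, and train a detector on the result. As a hedged sketch (not Apple’s actual pipeline or annotation format), one labeled screenshot might look like this in Swift:

```swift
import Foundation

// Hypothetical annotation format; Apple's data collection format is not public.
struct UIAnnotation: Codable {
    let elementClass: String               // e.g. "button", "text", "slider" (illustrative classes)
    let x: Double, y: Double               // top-left corner of the bounding box
    let width: Double, height: Double
}

struct LabeledScreenshot: Codable {
    let imageFile: String                  // the captured screenshot
    let annotations: [UIAnnotation]        // manually drawn boxes and their classes
}

// One of the "thousands of screenshots" described above, labeled by hand.
let sample = LabeledScreenshot(
    imageFile: "game_menu_001.png",
    annotations: [
        UIAnnotation(elementClass: "button", x: 120, y: 600, width: 150, height: 44),
        UIAnnotation(elementClass: "text",   x: 40,  y: 100, width: 310, height: 32)
    ]
)

// Records like this would be serialized and fed to the training system.
if let json = try? JSONEncoder().encode(sample), let text = String(data: json, encoding: .utf8) {
    print(text)
}
```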

TechCrunch says not to expect Screen Recognition on the Mac quite yet, as it would be a serious undertaking. However, Apple’s new Macs with the company’s custom M1 SoC include a 16-core Neural Engine that would certainly be up to the task, whenever Apple decides to expand this accessibility feature.

Check out the full interview and Apple’s new accessibility landing page, as well as a conversation on accessibility between TechCrunch’s Matthew Panzarino and Apple’s Chris Fleizach and Sarah Herrlinger.





Author: Michael Potuck
Source: 9to5Mac
