iOS 16 accessibility features likely to benefit everyone, argues Macworld

Apple yesterday previewed some iOS 16 accessibility features we can look forward to, including live captions, Apple Watch mirroring, and door detection.

It’s often been argued that accessibility features in general can benefit everyone, not just the target group, and Macworld argues that this is the case for all three of the above features …

The case for accessibility benefiting all

There are two arguments for accessibility features benefiting everyone.

The first is that it forces designers to think in new ways, and that kind of creativity often sparks ideas that otherwise might have been missed. The second is a positive version of the law of unintended consequences.

For example, closed captions on videos were initially designed for people who are deaf or hard of hearing, but now many people like to use captions – whether it’s to help with accents, or simply to be able to watch shows with the sound off.

Automatic doors were first intended for use by those who use wheelchairs and crutches, but now also benefit shoppers who have their hands full.

Screen readers were intended to help blind and visually impaired people, but features like having incoming messages read aloud while driving have been a big benefit to many.

Upcoming iOS 16 accessibility features

Live Captions, for example, are great for deaf and hard-of-hearing people, but better audio transcription is something we could all use, says Macworld’s Jason Cross.

It’s a natural extension of the on-device speech processing that was introduced last year in iOS 15, but it speaks to a big improvement in the sophistication of that feature.

We hope this means an improvement in Siri’s understanding of commands and dictation, but one could easily see these features show up in other places. Take, for example, the Notes app, where one can imagine a “transcribe” feature that creates text from any audio recording or video. If Apple is billing it as an accessibility feature, Live Captions’ transcription will need to be rock-solid, and that opens up a world of possibilities for the rest of iOS 16.

Door detection should lead to better object recognition, with benefits for AR applications. But it’s Apple Watch mirroring that Cross thinks could be most exciting.

Notably, this seems like it allows devices to communicate control intent in a way that AirPlay doesn’t right now. AirPlay pushes audio and video out to devices, and allows for simple controls (play/pause, volume, and so on), but allowing AirPlay-compatible devices to signal advanced touch controls seems new and could lead to some incredible new features.

Here’s a killer scenario: If Apple can mirror your Apple Watch to your iPhone and allow you to fully interact with it, it could probably mirror your iPhone to your Mac or iPad and do the same! That alone would be a game-changing feature.

9to5Mac’s Take

We’re definitely in the camp of accessibility features being good for everyone, and Cross makes a good case for these examples.

The latter point in particular. Yes, Catalyst apps are one approach, but I can certainly think of times when the easiest way to get something done would be to simply mirror my iPhone to my Mac and control it there – even for something as simple as using my Mac’s Magic Keyboard in an iPhone app, without the hassle of disconnecting the keyboard’s Bluetooth connection from the Mac, connecting it to the phone, and then switching back again.

Author: Ben Lovejoy
Source: 9to5Mac
