
Quadriplegic Apple user rates iOS 15 and watchOS 8 for accessibility

Quadriplegic Apple user and disability campaigner Colin Hughes last year shared his experiences of using Apple kit as someone who uses a wheelchair and has only very limited use of his hands. He’s now followed up with a look at the accessibility improvements Apple made in iOS 15 and watchOS 8.

He highlights the progress Apple has made, as well as the weaknesses in its accessibility support, including a couple of very surprising failings…

Hughes uses an iPhone 13 Pro, Apple Watch Series 7, and AirPods 3. He told me the biggest win for him has been auto-answer for phone calls on the Apple Watch.

As you will no doubt know, a significant part of my advocacy over the past three years has been calling on Apple to introduce Auto-Answer calls to the Apple Watch. At last, in watchOS 8, it has arrived!

I turned on the functionality for the first time and received a phone call on my wrist that was clear to me and to the caller. As my disability means I can’t touch the Watch screen, I didn’t need to do anything to handle the call effectively.

This brings a level of convenience, security and accessibility that is so important to people like me with severe upper limb disabilities. It was a special moment when I first had the chance to try it.

However, he also noted a very surprising failing – you can’t use Siri to switch on the auto-answer feature!

It is beyond ironic that auto-answer, a feature designed for people who can’t touch the screen, still requires you to touch the screen to toggle it on and off.

Please let users toggle auto-answer on and off with Siri voice commands (“Hey Siri, turn on/off auto-answer”) and by setting up Siri Shortcuts: for example, turn on auto-answer when I leave home, at a certain time, or when I get out of bed in the morning.
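What Hughes is asking for is technically straightforward. As a purely hypothetical sketch (Apple exposes no public API for the system auto-answer setting, so the AutoAnswerSettings type below is invented for illustration), this is roughly how such a toggle could be surfaced to Siri and Shortcuts using Apple’s App Intents framework:

import AppIntents

// Hypothetical: there is no public API for the system auto-answer
// setting, so AutoAnswerSettings stands in for one here.
final class AutoAnswerSettings {
    static let shared = AutoAnswerSettings()
    var isEnabled = false
}

struct ToggleAutoAnswerIntent: AppIntent {
    static var title: LocalizedStringResource = "Turn Auto-Answer On or Off"

    @Parameter(title: "Enabled")
    var enabled: Bool

    func perform() async throws -> some IntentResult {
        // Flip the stand-in setting to the requested state.
        AutoAnswerSettings.shared.isEnabled = enabled
        return .result()
    }
}

An intent like this would answer both halves of the request: it could be invoked by voice (“Hey Siri, turn on auto-answer”) or dropped into a Shortcuts automation that fires on leaving home or at a set time.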

Additionally, there’s what appears to be a bug: auto-answer on the Watch remains active when the Watch is on its charger rather than on your wrist.

It is a bit unnerving to have auto-answer kick in when the Watch is off my wrist and in a bedroom. It has privacy implications for people who forget to turn off auto-answer when the Watch is off the wrist and charging.

Another huge win has been Announce Calls and Notifications.

I can’t overstate how massive this has been for me. Every day, when I am out and about in the city, I am answering calls, sometimes really important calls, effortlessly, hands-free with just “Answer”.

Previously, unless I had auto-answer on (which unfortunately still requires me to remember to ask my carer to switch it on for me), I was never able to answer calls. This really increases independence for people like me.

Similarly, it has been a joy to have notifications from third-party apps like Facebook Messenger and WhatsApp read out to me for the first time while wearing AirPods, thanks to Announce Notifications in iOS 15. As someone who can’t pick up and open my iPhone to read messages and notifications, this new functionality makes me feel connected like never before.

I’ve dealt with important Outlook emails, WhatsApp messages, and more besides, hands-free, responding to and actioning important things promptly with only what I’m hearing in my ears through the AirPods. It is really liberating and productive.

However, there is another surprising failing here.

You still can’t hang up a call with a Siri voice command (“Hey Siri, hang up”), which causes many problems most days, as I can’t press the red button to end a phone call. The good news is I have reason to feel reasonably confident that Apple is listening to this gap in provision, and a solution is coming.

He says Siri continues to improve when it comes to commands.

Siri and Shortcuts keep getting better and better, and faster and more responsive. I can control more of my home devices with my voice, including my front door, allowing me to get in and out of my flat: “Hey Siri, open the door.”
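That door command is HomeKit doing the work. For a sense of what Siri is driving behind the scenes, here’s a minimal sketch of unlocking a HomeKit lock accessory in code. The single-home setup and the accessory name “Front Door” are assumptions, and a real app would also need the HomeKit entitlement, user permission, and fuller error handling:

import HomeKit

// Minimal sketch: unlock a HomeKit door lock programmatically,
// the same accessory a "Hey Siri, open the door" command drives.
// Assumes an accessory named "Front Door" in the primary home.
func unlockFrontDoor(in manager: HMHomeManager) {
    guard let home = manager.primaryHome,
          let lock = home.accessories.first(where: { $0.name == "Front Door" })
    else { return }

    // Find the lock-mechanism service and set its target state.
    for service in lock.services where service.serviceType == HMServiceTypeLockMechanism {
        for characteristic in service.characteristics
        where characteristic.characteristicType == HMCharacteristicTypeTargetLockMechanismState {
            let unlocked = HMCharacteristicValueLockMechanismState.unsecured.rawValue
            characteristic.writeValue(unlocked) { error in
                if let error = error {
                    print("Unlock failed: \(error.localizedDescription)")
                }
            }
        }
    }
}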

But he says Siri still performs poorly when dictating large amounts of text.

Speech recognition on desktops and laptops generally, both Apple and Windows, is in a bad place at the moment. My productivity is hanging by a thread, thanks only to Dragon Professional, the Firefox browser, and being able to run Dragon with Parallels on my Mac.

Sadly, Voice Control has hardly improved this year and remains good only for dictating a short (often error-strewn) sentence or two: you couldn’t write a 1,000-word blog article, run a business, or write a dissertation with its dictation capabilities. It would take you hours of frustration compared to Dragon.

As an Apple user, I am looking enviously at what Google is doing on the Pixel 6 at the moment with the Tensor chip. That’s the kind of sophistication I would like to see Apple provide to users with severe physical disabilities who rely on speech recognition on the Mac for work, education, and keeping in touch.

I believe access to technology and communication is a human right, and speech recognition is my only means of communicating with the world and doing grown-up things that go much further than dictating “happy birthday” with a heart emoji. Disabled people who rely on voice access deserve better than that.

In May, Apple introduced AssistiveTouch for Apple Watch, a feature specifically designed for people with limited upper limb mobility. Here’s what the company had to say about it:

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. 

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.
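Apple hasn’t published details of the gesture model, but the raw ingredient, wrist-motion data, is available to any developer through Core Motion. Purely as a toy illustration (a crude acceleration threshold, nothing like Apple’s sensor fusion and on-device ML), this is the kind of signal AssistiveTouch has to work with:

import CoreMotion

// Toy illustration only: flag a burst of wrist acceleration as a
// possible "clench." Apple's real AssistiveTouch fuses gyroscope,
// accelerometer, and heart-rate-sensor data through an ML model.
let motionManager = CMMotionManager()

func startToyClenchDetector() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // sample at 50 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        // Magnitude of user-generated acceleration (gravity removed).
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        if magnitude > 0.8 {  // arbitrary demo threshold, in g
            print("Possible clench gesture")
        }
    }
}

A fixed threshold like that also illustrates the problem Hughes goes on to describe: if the detector is tuned for movements stronger than a user can produce, the feature simply never triggers.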

Hughes says that, for him, it has completely failed to live up to its promise.

I am unable to activate it with the limited muscle power in my lower arms and hands. Apparently, I just don’t have enough physical movement in my arms and hands to trigger the accessibility feature. It’s made me question who Apple designed this for, because on paper it should be tailor-made for people like me. I am not completely paralysed; I have just enough lower arm and hand movement to wake the screen and clench my fist, but apparently this is not enough to make use of AssistiveTouch.

I would imagine a lot of people with upper limb disabilities won’t be able to make use of this technology. I am sure there are ways it could be tweaked and improved to extend access.

He also previously mentioned that Face ID worked with his CPAP mask all the way from the iPhone X to the iPhone 12, but ceased doing so with the more compact notch tech in the iPhone 13.

If you have accessibility needs, please share your own experiences in the comments.




Author: Ben Lovejoy
Source: 9to5Mac

