iOS 16 brings with it a much-improved dictation experience – one that lets you seamlessly switch between typing and speaking. I’ve already found it to be a massive improvement over iOS 15, but how does it compare to Google’s Pixel 6, which was unveiled last year with a new ‘assistant voice typing’ feature that brought similar hands-free typing and editing?
iOS 16 Dictation hands-on [Video]
iOS 16’s new dictation features
Apple didn’t spend much of the WWDC presentation going over the new dictation, but I think many users will find the improvements here to be some of their favorites with the new operating system. iOS 16 allows you to seamlessly switch between typing using your voice and using the keyboard, which no longer disappears when you start dictation. You can go back and edit previously written text using your voice and add emoji, and iOS 16 takes care of adding the necessary punctuation – no need to say “period” or “comma” for it to write in proper sentences.
You can see the new dictation experience in action in the hands-on video above.
Pixel 6’s voice typing
Last year, Google unveiled its own dictation overhaul with assistant voice typing on the Pixel 6 and Pixel 6 Pro. Powered by Google’s Tensor chip, it also brought the ability to seamlessly switch between typing and speaking, with automatic punctuation, emoji support, phonetically based suggestions, and more. Ever since getting my hands on a Pixel 6, I’ve seen it as hands-down the best voice-to-text experience I’ve ever had – but that gap has narrowed with iOS 16.
Apple’s and Google’s voice-typing compared
I did a head-to-head comparison of the two and was pleased with how much the iPhone has improved – but it still hasn’t caught up with the Pixel in my experience. It’s also worth noting this isn’t an apples-to-apples comparison. I was only using this on an older iPhone 11, with a slower Neural Engine, so I didn’t take typing speed into account. This is also still a very early developer beta, so there will be changes, improvements, and bug fixes by the time the software actually releases, at which point I hope to do another comparison.
Both of them picked up the emoji and punctuated around the list of names, but neither was totally accurate. Both devices messed up when it came to the URLs, but the iPhone’s dictation got more words and phrases outright wrong. One place the iPhone did better was in picking the correct spelling of the names of the individuals I was speaking about.
While my short time hands-on with the developer beta makes it clear there’s still room for improvement in iOS 16’s dictation, this is a major step forward. I think it’s also important to note that some users, specifically non-American English speakers, seem to have a worse experience with the Pixel’s voice typing, so your results may vary.
These general dictation features coming with iOS 16 are also accompanied by new voice controls that help make the iPhone more accessible. These include the ability to end calls with Siri, auto-answer call setup, and notification announcements.
If you’re running the developer beta, I’d love to hear your thoughts on the updated dictation experience. Let me know in the comments down below, and check out some of the other best features coming with iOS 16.
Author: Derek Wise
Source: 9to5Google