TalkBack 14: Rushed Steps in the Right Direction, but Still So Far Behind

I’ve talked a lot about Android on this blog. I love the idea, and the operating system is nice, but accessibility could be so much more. I’ve had my Galaxy S20 FE (5G) for about a year and a half, and I’ve seen the changes from Android 11 to 12, and finally to 13. TalkBack has improved steadily over that year and a half: it added Braille support, which I thought wouldn’t come for another five to ten years; text recognition in images and unlabeled buttons arrived in TalkBack 12.1; and spell checking and icon descriptions were added in TalkBack 13.

In this article, though, I’ll go over the changes in TalkBack 14, or at least the ones I have access to; I’ll get to that later. I’ll also talk about the problems facing Android that aren’t really about TalkBack, but are more about the accessibility framework and what apps can and can’t do. So this will be a continuation of my other Android articles, more than just a “what’s new in TalkBack” style article.

TalkBack 14, Lots of Braille

TalkBack 14 builds well on what TalkBack 13 started. TalkBack now has many more commands, both for Braille displays and for its Braille keyboard. One can now move around an edit field by words and lines, not just characters, using the onscreen Braille keyboard. One can also select, copy, and paste text using the same keyboard, without dismissing it first. To be fair to iOS, you can do all that with Braille Screen Input, but the commands are documented in neither Apple’s documentation nor the VoiceOver settings. In TalkBack’s settings, those commands are clearly documented and explained.

TalkBack 14 now supports the NLS EReader, which is being freely distributed to NLS patrons; by the end of the year, all 50 states will have it. You do have to connect the display to your phone via USB-C, and the cable I had on hand shorted out, so I have to find another one. I was able to use it through a USB hub, which made the setup even less mobile, but it did work. The commands, though, were more complicated than I expected: Enter with dots 4-5 moves to the next object, Space with dot 4 moves to the next line, and Space with dot 1 moves to the previous line. So I quickly moved back to using the EReader with the iPhone. I’ll practice with it more, but for now, using the EReader over Bluetooth on the iPhone, with its simpler commands, just feels more practical.

A Window into Images

TalkBack 14 has a new settings screen where you can enable options for image recognition. You have the usual text recognition and icon recognition, but the screen also refers to “image recognition,” similar to what VoiceOver can do. This is something I’ve wanted for a long time. Some people have a third option, “image descriptions,” but I don’t. Google often rolls features out to a small subset of users, then to everyone else after weeks or months of testing. We’ll have to see how that works out.

Of note, though, is that whenever one gets an iOS update, one gets all the new features right away. There is no staged rollout of features for VoiceOver; they’re just there. TalkBack 14, as a public release, should have all its features available to everyone at launch, in my opinion. Google could always label image descriptions as “beta.”

The Accessibility Framework

As I’ve said before, the operating system is the root of all accessibility. If the accessibility framework is limited, then apps are limited in what they can do for accessibility. This is why I’ve been so critical of Google: Android’s accessibility framework, and what apps can communicate to TalkBack through it, is limited. I’ll give a few examples.

Kindle

I love the books I can get on Kindle. I love that I can read them on just about all of my devices. But not all Kindle apps are created equal. The app on the iPhone is great: using VoiceOver, I just swipe down with two fingers and the book is read to me. I can move my finger up and down the screen to read by line. I can use a Braille display and just scroll through the book, with no page turning required, since that happens automatically. On Android, however, the Kindle app is more limited.

When you open a book in Kindle for Android, you find a page with a “start continuous reading” button. All this button does is pipe the text of the book out to the Android speech engine. This distinction is important. On iOS, since VoiceOver is controlling things, you can quickly speed up, slow down, pause and resume, or change the voice. You can read by word or letter, and, most importantly, read easily with a Braille display.
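
I don’t have Kindle’s source code, but that behavior matches the standard Android TextToSpeech API, which hands speech to the system engine entirely outside the screen reader. Here’s a minimal sketch of the pattern; the class and method names are my own invention, not Kindle’s:

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Hypothetical sketch of what "start continuous reading" appears to do:
// hand the book text straight to the system speech engine.
class BookReader(context: Context) {
    // This engine instance belongs to the app, not to TalkBack.
    private val tts = TextToSpeech(context) { status ->
        // status is SUCCESS or ERROR; a real app would handle failure.
    }

    fun startContinuousReading(pageText: String) {
        // QUEUE_FLUSH drops anything already queued. Because this speech
        // never passes through TalkBack, the screen reader's rate, voice,
        // pause/resume, and Braille routing simply don't apply to it.
        tts.speak(pageText, TextToSpeech.QUEUE_FLUSH, null, "page-1")
    }
}
```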

On Android, you can move your finger down the page to hear lines of text, which are sent to TalkBack as announcements. But if you try to have TalkBack read the book continuously, it won’t get past the current page. The same is even more true with Braille: you have to turn pages manually, using the touch screen, because it’s not actually TalkBack that’s turning the page. So you have to keep touching the phone’s screen to continue interacting with the app. Braille displays have keys for a reason; you shouldn’t have to use the touch screen at all while using a Braille display with your phone. Most Braille display users keep their phone in their pocket while driving it from their displays.
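
Those spoken lines behave like one-shot announcements, which any app can fire with a single View call. A hedged sketch, with illustrative names, of why announced text is a dead end for Braille:

```kotlin
import android.view.View

// Hedged sketch: sending a line of text to TalkBack as a one-shot
// announcement. pageView and lineOfText are illustrative names.
fun speakLine(pageView: View, lineOfText: String) {
    // announceForAccessibility fires an announcement event. TalkBack
    // speaks it once, but the text is not a real node on screen: you
    // can't move through it by word or character, and a Braille display
    // never sees it as content it can pan through.
    pageView.announceForAccessibility(lineOfText)
}
```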

With a lot of other book-reading apps, you can lock the screen and just listen to the book. Many blind Android users love that, and find it superior to reading with a screen reader. The Kindle app, however, doesn’t even manage that. When the screen times out and locks, the app finishes the current page and turns it, but then the speech stops. You have to unlock the screen and press “start continuous reading” again.

Now, if TalkBack could read the book and turn the pages itself, the experience would be much better. But Google’s accessibility framework has moved at a glacial pace throughout the ten or fifteen years of Android and iOS development. While Apple opened up APIs to developers so that VoiceOver could turn pages while reading, Google has not even added that feature to its own reading app. Instead, Play Books uses a web view and simply detects when the user has moved beyond the last element on the page, then turns the page. At least, that’s what I think is happening; I obviously don’t have access to the source code of the Play Books app.
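
To make that guess concrete, here is a purely speculative sketch of the trick as I imagine it: a “sentinel” view placed after the last line of the page, which turns the page the moment accessibility focus lands on it. None of this is confirmed to be how Play Books works, and the names are mine:

```kotlin
import android.os.Bundle
import android.view.View
import android.view.accessibility.AccessibilityNodeInfo

// Speculative sketch: when the screen reader's focus reaches a sentinel
// view past the last real element, turn the page. turnToNextPage and
// sentinel are illustrative names, not Play Books'.
fun watchForPageEnd(sentinel: View, turnToNextPage: () -> Unit) {
    sentinel.setAccessibilityDelegate(object : View.AccessibilityDelegate() {
        override fun performAccessibilityAction(
            host: View, action: Int, args: Bundle?
        ): Boolean {
            // TalkBack requests accessibility focus on this node when the
            // user swipes past the last line of the page.
            if (action == AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS) {
                turnToNextPage()
            }
            return super.performAccessibilityAction(host, action, args)
        }
    })
}
```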

Mortal Kombat

Games are becoming more and more important in the mobile ecosystem; mobile games are, in some cases, more popular than console games. But mobile games are sometimes very hard to make accessible. Take the Mortal Kombat game. You have an interface where you choose a game mode, build a team of fighters, upgrade cards, and change settings. Then you have the fight mode, where you tap to attack, swipe for a special attack, and hold two fingers on the screen to block. On iOS, the developers have made the buttons visible to VoiceOver and added labels to them. They’ve exposed the “tap to continue” text elements to VoiceOver and allowed a double tap to advance to the next screen. That part, I believe, could be done on Android as well.

The real fun is in the battles, though. Once a fight starts, on iOS, VoiceOver is pushed out of the way, so to speak, by a direct touch area. This allows taps and swipes to be sent directly to the app, so that I can play the game. While I’m fighting, though, the game sends text prompts to VoiceOver, like “swipe up,” or “Tap when line is in the middle.” I’m not sure exactly what the last one means, but “swipe up” is simple enough. This allows me to play, and win, battles.

Unfortunately for Android users, though, this “direct touch area” is not possible for regular apps. Google has not given app developers an equivalent to take advantage of. A developer could theoretically approximate it, but they’d have to build an accessibility service to go with the app, and then make sure that service is running whenever the app runs. Users are not going to turn on a separate accessibility service for a game, and developers are not going to spend time on all that for the relatively few blind people on Android.
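
To be concrete about what I mean: since Android 11, an accessibility service, and only an accessibility service, can declare a passthrough region where touches skip touch exploration. A game would have to ship, and convince users to enable, something like the sketch below; the class name and the full-screen region are my own illustration:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.graphics.Region
import android.view.Display
import android.view.accessibility.AccessibilityEvent

// Illustrative sketch, requires Android 11 (API 30). Only a running
// accessibility service, not an ordinary app, can punch this hole.
class FightPassthroughService : AccessibilityService() {
    override fun onServiceConnected() {
        // Let raw touches through on the whole default display while the
        // service runs; a real service would track just the fight screen.
        val wholeScreen = Region(0, 0, 10_000, 10_000)
        setTouchExplorationPassthroughRegion(Display.DEFAULT_DISPLAY, wholeScreen)
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
```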

Catching the Apple

Google, for the last few years, has been trying hard to catch up to Apple, and it has a long way to go. Apple, however, hasn’t stood still. They have a decade’s worth of built-up experience, code, frameworks, and blind users who, each time they try Android, find that it falls short and come back to iOS. I’m not saying Apple is perfect. And each time a wave of blind people tries Android, a few find that it works for what they need a phone for.

As more and more blind people lean into using a phone as their primary, or even secondary, computing device, accessibility is going to matter more than ever. We can’t afford half-baked solutions. We can’t afford stopgap measures. Companies that build their services on top of these platforms will do what they can to make their apps accessible, but they can only do so much. To make better apps, developers need rich, robust APIs and frameworks. Right now, that’s Apple. And I’ve gotten tired of holding my breath for Google; I’m just going to let that breath out and move on. I’ll probably keep my Android phone around, but I’m not going to use it as my primary device until Google gets their act together.

Some Android users will say that I’m being too harsh, that I’m not giving Google enough time, or that I’m being whiny, or radical, or militant. But it took Google ten or so years to add commands that used more than one finger. It took them ten years to add Braille support to their screen reader. It took them ten years to add spell checking. I’m not going to wait another ten years for them to catch up to where Apple was a good three years ago.

Devin Prater @devinprater