Google

TalkBack 14: Rushed Steps in the Right Direction, but still so far behind

I’ve talked a lot about Android on this blog. I love the idea, and the operating system is nice, but accessibility could be so much more. I’ve had my Galaxy S20 FE (5G) for about a year and a half, and I’ve seen the changes from Android 11, to 12, and finally to 13. TalkBack has improved steadily over that year and a half, adding Braille support, which I thought wouldn’t come for another five to ten years. Text recognition in images and unlabeled buttons was added in TalkBack 12.1, icon descriptions came in TalkBack 13, and spell checking arrived in TalkBack 13.1.

In this article, though, I’ll go over the changes in TalkBack 14, the ones I have access to, that is; I’ll get to that later. I’ll also talk about the problems facing Android that aren’t really about TalkBack, but are more about the accessibility framework and what apps can and can’t do. So this will be a sort of continuation of my other Android articles, more than just a “what’s new in TalkBack” style article.

TalkBack 14, Lots of Braille

TalkBack 14 is a good iteration on what TalkBack 13 started. TalkBack now has many more commands, both for Braille displays and for its onscreen Braille keyboard. One can now move around an edit field by words and lines, not just characters, using the onscreen Braille keyboard. One can also select, copy, and paste text using the same keyboard, without having to dismiss the keyboard just to do all that. To be fair to iOS, you can do the same with Braille Screen Input, but the commands are not documented in Apple’s documentation or in the VoiceOver settings. In TalkBack’s settings, those commands are clearly documented and explained.

TalkBack 14 now supports the NLS EReader, which is being freely distributed to NLS patrons; by the end of the year, all 50 states will have the EReader. You do have to connect the display to your phone via USB-C, and the cable I had on hand shorted out, so I have to find another one. I was able to use it with a USB hub, which made the setup even less mobile, but it did work. The commands, though, were rather more complicated than I expected. I had to press Enter with dots 4-5 to move to the next object, Space with dot 4 to move to the next line, and Space with dot 1 to move to the previous line. So I quickly moved back to using the EReader with the iPhone. I’ll practice with it more, but for now it just doesn’t feel as practical as using the EReader over Bluetooth on the iPhone, with its simpler commands.

A window into Images

TalkBack 14 has a new screen of choices where you can enable options for image recognition. You have the usual text recognition and icon recognition, but the screen also refers to “image recognition,” similar to what VoiceOver can do. This is something I’ve wanted for a long time. Some people have a third option, “image descriptions,” but I don’t have that option. Google often rolls out features to a small subset of users, then rolls them out to everyone else after weeks or months of testing. We’ll have to see how that works out.

Of note, though, is that whenever one gets an iOS update, one gets all the new features right away. There is no rollout of features for VoiceOver; it’s just there. TalkBack 14, as a public release, should have all its features available to everyone at launch, in my opinion. They could always label image descriptions as “beta.”

The Accessibility Framework

As I’ve said before, the operating system is the root of all accessibility. If the accessibility framework is limited, then apps are limited in what they can do as far as accessibility is concerned. This is why I’ve been so critical of Google, because Android’s accessibility framework, and what apps can communicate to TalkBack, is limited. I’ll give a few examples.

Kindle

I love the books I can get on Kindle. I love that I can read them on just about all of my devices. But not all Kindle apps are created equal. The app on the iPhone is great. Using VoiceOver, I just swipe down with two fingers and the book is read to me. I can move my finger up and down the screen to read by line. I can use a Braille display and just scroll through the book, no turning pages required since it happens automatically. On Android, however, the Kindle app is more limited.

When you open a book in Kindle for Android, you find a page with a “start continuous reading” button. All this button does is pipe the text of the book out to the Android speech engine. This distinction is important. On iOS, since VoiceOver is controlling things, you can quickly speed up, slow down, pause and resume, or change the voice. On iOS, you can read by word or letter, and most importantly, read easily with a Braille display.
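To make that distinction concrete, here’s a rough sketch of what a button like that ends up doing (this is illustrative Kotlin, not Amazon’s actual code; the class name and page list are made up): the text goes straight to the platform’s TextToSpeech engine, completely outside TalkBack.

```kotlin
import android.speech.tts.TextToSpeech

// Illustrative sketch only, not the Kindle app's real code.
// "Start continuous reading" style playback: the book text is handed
// straight to the platform speech engine, so TalkBack never sees it and
// its reading controls (speed, pause/resume, word navigation, Braille)
// don't apply.
class ContinuousReader(private val tts: TextToSpeech) {

    fun startContinuousReading(pages: List<String>) {
        pages.forEachIndexed { index, pageText ->
            // QUEUE_ADD keeps appending pages to the engine's own queue;
            // only the app controls playback, not the screen reader.
            tts.speak(pageText, TextToSpeech.QUEUE_ADD, null, "page-$index")
        }
    }

    fun stop() {
        tts.stop()
    }
}
```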

On Android, you can move your finger down the page to hear lines of text, which are sent to TalkBack as announcements. But if you try to have TalkBack read the book, it won’t get past the current page. The same is even more true with Braille: you have to turn pages manually, using the touch screen, because it’s not actually TalkBack that’s turning the page. So you have to keep touching the phone’s touch screen in order to continue interacting with the app. Braille displays have keys for a reason. You shouldn’t have to use the touch screen to do anything while using a Braille display with your phone. Most Braille display users keep their phone in their pocket while they control it from their displays.
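Those per-line announcements are probably something like the standard announcement call below (a sketch with placeholder names, not the app’s actual code). An announcement is one-shot speech: TalkBack says it and moves on, so there’s nothing for continuous reading to continue into, and nothing a Braille display can sit on and pan through.

```kotlin
import android.view.View

// Illustrative sketch: speaking a line of the book as a one-shot
// accessibility announcement. The line never becomes a node in the
// accessibility tree, so TalkBack can't keep reading past it and a
// Braille display only gets a transient flash message.
fun announceLine(pageView: View, line: String) {
    pageView.announceForAccessibility(line)
}
```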

With a lot of other book-reading apps, you can just lock the screen and listen to the book. Many blind Android users love that, and find it superior to reading with a screen reader. The Kindle app doesn’t even manage that, though. Whenever the screen times out and locks, the current page finishes, the page turns, but the speech stops. You have to unlock the screen and press “start continuous reading” again.

Now, if TalkBack could read the book and turn the page itself, the experience would be much better. But Google’s accessibility framework has moved at a glacial pace throughout the ten or fifteen years of Android, and iOS, development. While Apple opened up APIs to developers so that VoiceOver could turn pages while reading, Google has not even added that feature to their own reading app. Instead, Play Books uses a web view and just detects when the user has gone beyond the last element on the page, then turns the page. At least, that’s what I think is happening; I obviously don’t have access to the source code of the Play Books app.
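If my guess is right, the heuristic would look something like the sketch below: an accessibility delegate on the last element of the page that turns the page the moment TalkBack focuses it. The view name and turnPage() callback are made up for illustration; this is not Play Books’ actual code.

```kotlin
import android.os.Bundle
import android.view.View
import androidx.core.view.AccessibilityDelegateCompat
import androidx.core.view.ViewCompat
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat

// Hypothetical sketch of an end-of-page heuristic: when the screen
// reader moves accessibility focus onto the last element of the page,
// turn the page so reading can continue.
fun autoTurnAfterLastElement(lastElementOnPage: View, turnPage: () -> Unit) {
    ViewCompat.setAccessibilityDelegate(lastElementOnPage, object : AccessibilityDelegateCompat() {
        override fun performAccessibilityAction(host: View, action: Int, args: Bundle?): Boolean {
            if (action == AccessibilityNodeInfoCompat.ACTION_ACCESSIBILITY_FOCUS) {
                turnPage() // assumption: the app exposes some page-turn call to make here
            }
            return super.performAccessibilityAction(host, action, args)
        }
    })
}
```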

Mortal Kombat

Games are becoming more and more important in the mobile ecosystem. Mobile games are, in some cases, more popular than console games. But mobile games are sometimes very hard to make accessible. Take the Mortal Kombat game. You have an interface where you choose a game mode, make a team of fighters, upgrade cards, and change settings. Then, you have the fight mode, where you tap to attack, swipe for a special attack, and hold two fingers on the screen to block. On iOS, the developers have made the buttons visible to VoiceOver, and added labels to them. They’ve shown the text elements, where you “tap to continue”, to VoiceOver, and allowed the double tap to advance to the next screen. That part, I believe, could be done on Android as well.
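If that part can be done on Android, as I believe it can, it would be ordinary view-level work, something like this minimal sketch (the view names and advance() callback are placeholders, not the game’s actual code): label the image-only buttons, and make the “tap to continue” overlay clickable so a TalkBack double tap activates it.

```kotlin
import android.view.View

// Illustrative sketch: the kind of labeling a game's menu screens need
// so TalkBack can read and activate them. View names are placeholders.
fun exposeMenuToTalkBack(fightButton: View, continueOverlay: View, advance: () -> Unit) {
    // Give an image-only button a label TalkBack can speak.
    fightButton.contentDescription = "Start fight"

    // Make the "tap to continue" overlay a real clickable target, so a
    // TalkBack double tap advances to the next screen.
    continueOverlay.contentDescription = "Tap to continue"
    continueOverlay.isClickable = true
    continueOverlay.setOnClickListener { advance() }
}
```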

The real fun is in the battles, though. Once a fight starts, on iOS, VoiceOver is pushed out of the way, so to speak, by a direct touch area. This allows taps and swipes to be sent directly to the app, so that I can play the game. While I’m fighting, though, the game sends text prompts to VoiceOver, like “swipe up,” or “Tap when line is in the middle.” I’m not sure exactly what the last one means, but “swipe up” is simple enough. This allows me to play, and win, battles.

Unfortunately for Android users, though, this “direct touch area” is not possible. Google has not added this feature for app developers to take advantage of. They theoretically could, but they’d then have to make an accessibility service for the app, and then make sure that the service is running when the app runs. Users are not going to turn on an accessibility service for a game, and developers are not going to spend time dealing with all that for the few blind people, relatively speaking, on Android.
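For a sense of what “make an accessibility service” even means, here’s roughly the skeleton a developer would have to ship (a sketch, not anything Google or the game’s developers actually provide): a separate service, declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission, which the user then has to find and enable in the system’s accessibility settings before the game can do anything with it.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Hypothetical skeleton of the accessibility service a game would have
// to bundle. The passthrough logic itself is omitted; the point is the
// boilerplate and the extra setup the user has to do just to play.
class GameTouchPassThroughService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // React to events from the game's windows here.
    }

    override fun onInterrupt() {
        // Required override; called when the system interrupts feedback.
    }
}
```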

Catching the Apple

Google, for the last few years, has been trying hard to catch up to Apple. They have a long way to go. Apple, however, hasn’t stood still. They have a decade’s worth of built-up experience, code, frameworks, and blind people who, each time they try Android, find that it falls short and come back to iOS. I’m not saying Apple is perfect. And each time a wave of blind people tries Android, a few find that it works for what they need a phone for.

As more and more blind people lean into using a phone as their primary computing device, or even their secondary computing device, accessibility is going to be more important than ever. We can’t afford half-baked solutions. We can’t afford stopgap measures. Companies that build their services on top of these platforms will do what they can to make their apps accessible, but they can only do so much. In order to make better apps, developers need rich, robust APIs and frameworks. And right now, that’s Apple. I’ve gotten tired of holding my breath for Google. I’m just going to let out that breath and move on. I’ll probably keep my Android phone around, but I’m not going to use it as my primary device until Google gets their act together.

Some Android users will say that I’m being too harsh, that I’m not giving Google enough time, or that I’m being whiny, or radical, or militant. But it took Google ten or so years to add commands that used more than one finger. It took them ten years to add Braille support to their screen reader. It took them ten years to add spell checking. I’m not going to wait another ten years for them to catch up to where Apple was a good three years ago.

Categories: accessibility Android blindness Google iPhone productivity

Published: July 17, 2023


The Rings of Google's Trusted Testers program

Over the years, I’ve owned a few Android devices. From the Samsung Stratosphere to the Pixel 1 to the Braille Note Touch, and now the Galaxy S20 FE 5G. I remember the eyes-free Google group, where TalkBack developers were among us mere mortals. I remember being in the old TalkBack beta program. I remember when anyone in the Eyes-free group could be in the beta. And now, that is no longer the case.

In this post, I’ll talk about the Accessibility Trusted Testers program, how it works in practice, in my own experience, and how this doesn’t help either TalkBack as a screen reader or Google’s image as a responsive, responsible, and open provider of accessibility technology. In this article, I will not name names, because I don’t think the state of things results from individual members of the TalkBack or accessibility team. And as I’ve said before, these are my experiences. Someone who is more connected or famous in the blind community will most certainly have better results.

The outer ring

When you open the link to join the accessibility trusted tester program, you’ll find this text:

Participate in early product development and user research studies from select countries.

After signing up, you’ll get an email welcoming you to the program. Afterwards, you get emails about surveys and sessions you can do. This isn’t just for blind people, either. There are a lot of sessions for visually impaired people, Deaf people, and wheelchair users, and yes, there are a lot more of them than there are for blind people. A good many of them require that you be near a Google office, so they require transportation. I won’t go into detail about what the sessions and surveys are about, but this overview should give you a good enough idea.

The Inner Ring

Now we get into the stuff that I take issue with. There is no way for someone not in the loop to even know this inner ring exists. If you contact someone on the accessibility team at Google, you can ask to be placed in the TalkBack and/or Lookout testing programs. Depending on who you ask, you may or may not get any response at all. Afterwards, the process may get stuck in a few places: searching for you in the program, calling out to another person, and so on. And no, I’m not in either private beta program. The last time I heard from them was two months ago now.

The things I have issues with are many, and I’ll go over them. First, when someone signs up for these trusted tester programs, they think that, because it’s a “tester” program, they’ll gain access to beta versions of TalkBack and so on. They don’t.

Second, some of these sessions require you to travel to Google’s offices. Blind people are scattered across states, countries, and provinces, and there are few Google offices. So, if a blind person wants to attend a session, they’ll have to travel to California to do so. And that means that only Californian blind people who are in the program will even know about the study and attend.

And third, the biggest, is this. When the program opened up after the demolition of the eyes-free group, the people who had been using Android the longest flooded in. So, throughout all these years, it’s been them, the long-time Android users, providing the feedback. People who haven’t used iOS in years, people who don’t care about images, people who have found their preferred apps and stick with them. So, when new people come to Android, the older users have a bunch of third-party apps for email, messaging, launchers, and so on. Sure, the new users can talk about how the first-party experience is on the Blind Android Users mailing list and Telegram group, but the older users always have some third-party way of doing things, or a workaround of “use headphones” or “mute TalkBack” or “use another screen reader” or “go back to the iPhone.” And I’ve nearly had enough of that. Sighted people don’t have to download a whole other mail client, or mute TalkBack while talking to Google Assistant, or use a third-party Braille driver like BRLTTY, or use an iPhone to read Kindle books well in Braille or talk to the voice assistant without being talked over.

Also, the Trusted Testers program only covers the US and maybe Canada. Most blind Android users are in other countries, so their voices are, for all intents and purposes, muted. All the devices they use are ones the TalkBack beta program will never catch. A great example of this is the spell checking released in TalkBack 13.1: on Pixels, when you choose a correction, that word is spelled out; on Samsung and other phones, it’s not. It makes me wonder what else I’m missing by using a non-Google phone. And that’s not how Android is supposed to work. If we now have to buy Google phones to get the best accessibility, how is that better than Apple, where we have to buy Pro iPhones to get the most and best features?

How this can be Fixed

Google has a method by which, in the Play Store, one can get the beta version of an app. Google could use this for TalkBack and Lookout; there is absolutely nothing stopping them from doing so. Google could also release source code for the latest TalkBack builds, including beta and alpha builds, and just have users build at their own risk. Google could open the beta programs to everyone who wants to leave feedback and help. After all, it’s not just Google phones that people use, and the majority of blind people don’t use Pixel phones. Blind people also have spaces for talking about Android accessibility, primarily the Blind Android Users mailing list and Telegram group. I’d love to see Google employees hanging out there, from the TalkBack team to the Assistant team, the Bard team, and the Gmail and YouTube teams. Then we could all collaborate on things like using TalkBack actions in YouTube, moving through a thread of messages in Gmail, and having TalkBack not speak over someone talking to the Assistant, with or without headphones in.

How can I help

If you’re working at Google, talk to people about this. Talk to your team, your manager, and so on. If you know people working at Google, talk to them. Ask them why all this is. Ask them to open up a little, for the benefit of users and their products, especially accessibility tools. If you’re an Android user, talk to the accessibility folks about it. If you’re at a convention where they are, ask them about this. If you’re not, they’ve listed their email addresses. I want anyone who wants to make Android accessibility, and TalkBack, the best it can be to be able to use the latest software, try beta builds, and provide feedback directly to the people making it. Google doesn’t need to be another Apple. Even Apple provides beta access, through iOS betas, to any eligible iPhone. Since Samsung barely does any TalkBack updates until half a year or more later, it’s seriously up to Google to move this forward. I’ve known people who plug their phone into a docking station and use it as a computer. I want blind people to be able to do that.

In order to move this forward, though, we need to push for it. We need to let Google know that a few people who have been using Android for the past 10 years aren’t enough. We need to let them know that there are more countries than the United States and Canada. We need to let them know that we want to work with them, to collaborate with them, not for them to tell us what we want through a loud minority.

TalkBack doesn’t have as many options and features as VoiceOver, but it’s started out on solid ground. ChromeVox doesn’t have as many options and features as JAWS, but it has started out on a solid foundation. Together, though, the community and Google can make both of these platforms, with the openness of Android, on both phones and Chromebooks, and Linux containers on Chromebooks, the best platforms they can be! All it takes is communication!

Categories: accessibility Android blindness Google productivity

Published: July 2, 2023


Google: Full Speed Ahead

For years now, Google has been seen, for good reasons I’d say, as moving very slowly with accessibility. TalkBack would get updates in fits and starts, but otherwise didn’t seem to have people that could devote much time to it. Starting a few years ago with multi-finger gestures, TalkBack development began picking up steam, and to my surprise and delight and relief, it has not slowed down. They seem to spend as much time resolving issues as they spend creating new features and experiences. This was highlighted in the new TalkBack update that began rollout on January 9.

On that day, there was a TalkBack update from Google (not Samsung) which bumped the version to TalkBack 13.1. New in this version is the ability to use your HID Braille display over USB. Support for Bluetooth will come when Android has Bluetooth drivers for these displays. That alone is worth an update. But there’s more! New in TalkBack is the ability to spell check messages, notes, and documents. That alone was worth two major iOS updates to complete. But there’s more! Now, we can use actions the same way iOS does. That alone would have been worth several updates. Now, we have many more languages available for Braille users. We can now switch the direction of the panning buttons. On the Focus Braille display, the right whiz-wheel-style buttons now pan, giving two ways to pan text. We can now move from one container to another, just like in iOS.

Now, I know that was a lot of info, in just a minor version bump. So let’s unpack things a bit. I’ll describe the new features, and why they impress me a lot more than Apple’s latest offerings.

HID Braille over USB

When TalkBack’s Braille support was shown off last year, there was a lot of talk about the displays that were left out. Displays from Humanware, which use the Braille HID standard, were not on the list, mainly because there are no Android Bluetooth drivers for these displays, meaning TalkBack can’t do anything with them over Bluetooth. However, with this update, people who have these displays, like the NLS EReader from Humanware, can plug them into their phone with a USB-C cable and use them with TalkBack. This is made even simpler because Android phones already use USB-C, so you don’t need an adaptor to plug your display into your phone.

This demonstrates two things, to me. First, the TalkBack team is willing to do as much as they can to support these new displays and the new standard. I’m sure they’re doing all they can to work with the Bluetooth team to get a driver made into Android 14 or 15. Second, even if the wider Android team doesn’t have something ready, the accessibility team will do whatever they can to get something to work. Since Braille is important, they released USB support for these displays now, rather than waiting for Bluetooth support later. But when they get Bluetooth support, adding that support for these displays should be easier and quicker.

Now, TalkBack’s Braille support isn’t perfect, as we’ll see soon, but when you’re walking down a path, steps are what matters. And walking forward slowly is so much better than running and falling several times and getting bugs and dirt all over you.

Spellchecking is finally here!

One day, I want to be able to use my phone as my only computing device. I would like to use it for playing games, writing blog posts like this one, web browsing, email, note-taking, everything at work, coding, learning to code, and Linux stuff. While iOS’ VoiceOver has better app support from the likes of Ulysses and such, Android is building what could ultimately provide many developers a reason to support accessibility. Another brick was just put into place, the ability to spell check.

This uses two new areas of TalkBack’s “reading controls”: a new control for finding spelling errors, and the new Actions control for correcting the misspelling. It works best if you start from the top of a file or text field. You switch the reading control to the “Spell check” option, swipe up or down to find a misspelled word, then change the control to “actions” and choose a correction. iOS users may then say “Well yeah, I can do that too.” But that’s the point. We can now even more clearly make the choice of iPhone or Android, not based on “Can I get stuff done?” but on “How much do I want to do with my phone?” and “How much control do I want over the experience?” This is all about leveling the field between the two systems, and letting blind people decide what they like, more than what they need.

Actions become instant

From what I have seen, the iPhone has always had actions. VoiceOver users could always delete an email, dismiss notifications, and reschedule reminders with the Actions rotor, where a user can swipe up or down with one finger to select an option, then double tap to activate that option. This allows blind people to perform swipe actions, like deleting a message, liking a post, boosting a toot, or going to a video’s channel. Android had them too, they were just buried in an Actions menu. Unless you assigned a command to it, you had to open the TalkBack menu, double tap on Actions, find the action you wanted, and then double tap. Here are the steps for a new Android user, who has not customized the commands, to dismiss a notification through the Actions menu:

1. Move TalkBack focus to the notification.
2. Open the TalkBack menu.
3. Find and double tap “Actions.”
4. Find the “Dismiss” action.
5. Double tap to dismiss the notification.

Now, with the new Actions reading control, here’s how the same user will dismiss a notification:

1. Switch the reading control to “Actions.”
2. Swipe up or down with one finger until you hear “Dismiss.”
3. Double tap to dismiss the notification.

This action is one that users perform hundreds of times per day. This essential task has been taken down from five steps, to three. And, with TalkBack’s excellent focus management, once you dismiss a notification, TalkBack immediately begins speaking the next one. So to dismiss the next one, you just swipe up with one finger, then double tap again. It’s effortless, quick, and is delightfully responsive.

On Android, since actions have been rather hidden from users, developers haven’t always put them into their apps. Of course, not every app needs them, but they would help apps like YouTube, YouTube Music, Facebook, GoodReads, PocketCasts, Google Messages, WhatsApp, Walmart, and Podcast Addict, to name a few. It will take some time for word of this new ability to spread around the Android developer space. For Android developers who may be reading this, please refer to this section on adding accessibility actions. That entire page is a great resource for creating accessible apps. It describes things clearly and gives examples of using those sections in code.
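For a taste of how small the developer side of this is, here’s a sketch of adding a custom action to a list row with AndroidX (the row view and dismissNotification() callback are placeholders, not any particular app’s code); the label then shows up in TalkBack’s Actions reading control and Actions menu automatically.

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Illustrative sketch: expose a "Dismiss" action on a list row so it
// appears under TalkBack's Actions control. The callback is a placeholder.
fun addDismissAction(rowView: View, dismissNotification: () -> Unit) {
    ViewCompat.addAccessibilityAction(rowView, "Dismiss") { _, _ ->
        dismissNotification()
        true // report that the action was handled
    }
}
```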

Interestingly, the other method of accessing actions is still around. If you have an app, like Tusky, which has many actions, and you want to access one at the end of the list, you can still open the Actions menu, find the action you want, and double tap. In Android, we have options.

New Languages and Braille features

One piece of critical feedback from users of the Braille support was that only about four languages were supported. Now, besides a few like Japanese and Esperanto, many languages are supported. One can add new Braille languages or remove them, like Braille tables in iOS, except everyone knows what a language means in this instance, while very few know what a Braille table is. That goes into the sometimes very technical language that blindness companies use in their products, from “radio button” to “verbosity,” which I should write about in the future. For now, though, Google named its stuff right, in my opinion.

In the advanced screen of Braille settings, you can now reverse the direction of panning buttons. I never liked this, but if someone else does, it’s there. You can also have Braille shown on the screen, for sighted users or developers.

For now, though, if you choose American English Braille, instead of Unified English Braille, you can only use Grade one Braille, and not Grade two. However, computer Braille is now an option, so you can finally read NLS BARD Braille books, or code in Braille, on your phone. This brings textual reading a step closer on Android!

Faster and cleaner

Speed matters. Bug fixes matter. In TalkBack 13.1, Google gave us both. TalkBack, especially while writing with the onscreen Braille keyboard, is somehow even snappier than before. That bug where, if you paused speech, TalkBack couldn’t read past one line of a multi-line item from then on, is gone. TalkBack now reads the time as the first thing it says, every time you wake up your phone.

Meanwhile, if I have VoiceOver start reading a page down from the current position, it stops speaking for no reason. iOS feels old and sluggish, and I don’t feel like I can trust it to keep up with me. And I just want Apple to focus on fixing its bugs rather than working on new features. They spent resources on technology like that DotPad they were so excited about, but no blind people have this device, while their tried and true Braille display support suffers. Yeah, I’m still a bit mad about that.

The key takeaway from this section is that perhaps real innovation is when you can push out features without breaking as much stuff as you add. For blind people, a screen reader isn’t just a cool feature, or a way to look kind in the media, or a way to help out a small business with cool new tech. It’s a tool that had better be ready to do its job. Blind people rely on this technology. It’s not a fun side project, it’s not a brain experiment. It’s very practical work that requires care for people who are, often, not like you.

Luckily, Google has blind people who work for them. And, if the past year is any example, they’re finally getting the resources, or attention, they need to really address customer feedback and provide blind Android users with what will make Android a great system to use.

Categories: accessibility Android blindness Google

Published: January 13, 2023
