The Right kind of Blind

I know I really shouldn’t write this, but at this point it doesn’t matter. I’ve run away from those who will most likely look upon this with disdain, as if I don’t look upon myself with even more disdain than anyone on the outside could. I was reading a post from someone who mentioned “the right kind of disabled.” That struck something in me that I’d been looking for for a long time. I thought I was the only one. Maybe I am. But it needs telling anyway, so why not.

Acceptable disability

You know those blind people who can get around an area they’ve barely been to, or cook on a stove, or even communicate well with body language and such? I’m sure many in the blind community know these people, or are one of them. These blind people don’t need much. They don’t ask for help, they know their way around, and workplaces are proud to have the acceptable kind of person with disabilities, because they barely notice that the person is blind.

And that’s the key, isn’t it? The person that can hide their disability so well that no one really stops to think about it.

The wrong kind of blind

Yep, it’s one of those blog posts. If you want something more cheerful and full of Christmas bullshit, there are lots of Amazon sales going on, I’m sure. So, I’ve been blind since birth. You’d think I’d be adjusted to it and know how to do every little thing. You’d think I’d be one of those web accessibility experts by now, talking about alt text and ripping web devs a new one because they used a div instead of semantic HTML. But nope. I’m the other kind of blind person. The kind the media doesn’t want to talk about. The kind organizations are ashamed of. The kind they have to put into a corner and hope he doesn’t rock the boat.

Nope. Instead of web accessibility, I’m all into OS and app accessibility. Instead of bravely taking on my problems, I run from them. All the way away from them. Yes, that’s why I left Mastodon. No, I didn’t kill myself; I’m far too cowardly for that. I’m the weird one that’s all into text formatting, Markdown, and other things that almost no other blind person cares about. I’ve tried Seeing with Sound. I use AI. I feel passionate about free open source software, even though I keep those feelings locked up tight so I’m not too disappointed in it.

While just about all blind people I know love parties, loud noises are sometimes physically painful for me. If I hear a pot clang, or a door slam, it’s like a stinging feeling in my arms or shoulders. So I stay home and chill. I joke to the people around me that I’m the most chill person they’ll ever meet. But it’s sadly not a joke. Everyone else wants to go somewhere, or do something, or get out of the house. My house is my vacation.

So what?

Being the way I am comes with a ton of downsides. I’m not as quick or persevering as the acceptable blind person. I was once in a meeting to help make an operating system more accessible, in which we needed to open a document and work on it. The meeting used a web app that the organizers said would be accessible. I had issues with it, so I spent the five minutes we had trying to find accessibility documentation for the web editor. There was none. And when I asked about it, everyone went all quiet, like it was an intrusion and I shouldn’t be there and why was I even trying. And then the people seemed to shrug their shoulders and go on with the meeting. I can never forget that.

Side note: if any corporation, company, or organization says they take accessibility seriously, or that they put accessibility front and center, they’re most likely lying to you, and mocking you, because who dares tell someone they’re not doing enough in “the good fight?” Lol, Microsoft. We know you. Windows 11 speaks for itself.

And now, at this point, I’d give tips on how you, yes, you, can help the unaccepted blind. But I won’t, mainly because it’s all over every other blog post about dealing with neurodivergent people. Listen, then act. Of course, the listening part is really, really important. But acting is what’s actually going to make the person trust that you give even a single damn. Lots of companies, organizations, and nonprofits are great at listening. They have issue trackers, support lines, ticket systems, help desks, and open-door policies. But when it comes to acting, well… “We need more people to make us do it.” “We need more feedback.” “Maybe next year.” “This should be sent to IT instead, because I can’t actually help you.” “RIM is too expensive; see if you can make a deal or cut down the price for us.” “We’ll make a committee for this as a part of DEI, even though they have lots of other stuff and none of these people are blind, so yeah, good luck, haha.” “Well, we should do it, but it’s hard.” “We don’t want to do it because other software should do it for us.” Just try not to lean on these excuses, and you’ll be good. And if all you’ve got left is “We can’t help with accessibility,” then just tell us. Tell us you don’t care enough, or can’t do it, or have tons of other priorities, so we can go somewhere else.

Categories: accessibility blindness


Contextual Accessibility

I’m sure all this has been said many times before, but all I’ve been able to find so far is this article on alt text, which isn’t what I’m going for here. As a side note, I get that most members of society can only improve accessibility by adding alt text to the visual media they create, but isn’t it time we move a bit past just image descriptions? Guides for alt text are everywhere. I am not about to make yet another. This will, however, be a longer article than usual, and I wouldn’t blame you for coming up for air during this one.

In this article, I’ll talk about accessibility for blind people: not just the screen reader, which gets plenty of attention already, but also the operating system and the activity the blind person is trying to do. In writing this article, I hope to zoom out a bit and show developers, and others interested, that accessibility encompasses everything about the device: operating system, accessibility stack, screen reader, app, and the person using the device. There is no solitary part of that stack.

How to introduce a screen reader

I know this is going to be an edge case. But honestly, people in the accessibility space should be used to edge cases by now, and this happens much more than you’d think. We have our blind person here: a person who, up to now, has been fully sighted, but lost her sight instantly due to an accident. No, I’m not going into details, because there are tons of ways a person can lose their vision. No, this isn’t a real person, and I’m not, in any way, saying that this is how instant sight loss occurs or affects people, but I know that if I lost my sight this way, this is probably what would happen. And I know I’m not the only one that struggles with depression and anxiety.

So this person, let’s name her Anna, is trying to get used to having an entire sense just gone. Maybe she’s in pain, or maybe her mind is numb in trying to process everything. Or maybe she’s about to give up and find a sharp knife with which to end it all, searching and searching in the now permanent dark. But she remembers learning about Helen Keller in school. Thinking back, though, she doesn’t remember covering any more than Keller’s childhood. Anna wants to find a book on the rest of Keller’s life, so she stores that in the back of her mind, with a note to figure out how to get an audiobook.

Now, you may consider this amount of personal story to be a bit much, or over the top, or inappropriate for an informational post. But the usual detached approach makes blind people into nothing more than test subjects to be observed and understood on a sort of acquaintance basis. I, for one, am tired of that. I, like every other blind person, am a living, thinking, feeling person. Notice that I didn’t say “blind user” above? No. A user is some case number or hypothetical list of assumptions to be grabbed, accounted for, and discarded at the end. But a person, well, they’re alive. They deal with their everyday trials and triumphs. They breathe and speak and change and grow and learn. A person is a bit less easy to discard, or to mock as “just another ‘luser.’” And yes, I’ve had to learn this too, over my years of teaching blind people.

Now, Anna got some help with her mental health, and she’s trying to learn to deal with being blind. She’s learned to feel for the light switch, to tell if a light is on or not, since she can’t see it. She’s learned that she needs to keep everything in order, so that she can find things much more easily. She’s even started making her own sandwiches, ham and cheese and some mayo, and pouring some cereal, even though she makes a bit of a mess with the milk. She now keeps a towel on the counter for that reason.

But today, she’s gonna do something that she thinks would be impossible. She feels around on her nightstand for the slab of glass. Her iPhone. She hasn’t touched it since she went blind a week ago. Or has it been a few weeks? A month? She takes a deep breath, and lets it out. She breathes for a moment, then starts to examine the device. A flat screen with two horizontal lines for the earpiece. Buttons on both sides. A flat piece of glass with the camera on the back. You get the idea.

She then decides to try using Siri. Siri wasn’t especially useful to her before, but it’s all she has now. She tries some simple requests, like asking for the time, or the weather. She then decides to be a little more brave. Taking a deep breath, she asks “How can a blind person use an iPhone?” Siri responds, “Here’s what I found on the web. Check it out!”

Anna waits a moment, remembering what this screen looks like. She waits, thinking that maybe Siri would be so kind as to read it out to her. Siri, it turns out, is not so kind. Her room is silent. Tears do not make a sound as they fall.

She sits there for a moment, trying to pull herself together. Trying to gather up the pieces that had just gone flying everywhere. She breathes in, and breathes out. She grips the phone in her hands, and tries the last thing she can think of. Because if this doesn’t work, then this expensive phone is nothing but expense to her. She tells Siri to call her mother. “Okay, calling Mom.” Her heart stops to listen. Her mind strains to hear. And the phone finally rings. And, after a moment, she hears her mother’s voice.


Now, this is without any kind of outside agency helping her, and I shudder to think of how many blind people go through this. They use Siri to text and call, read their texts and notifications, create a note and read it out, and set alarms. But apps, well, YouTube and books and anything besides texting, those are lost to them. Yes, having the ability to make a call or send a text is really important. And in this case, it’s probably what made the difference between barely surviving and possibly thriving. Because I’ve known people who come in with their iPhone and only know Siri. And maybe they’ve tried to use VoiceOver. But here’s where that falls apart.


Anna waited for the doctor to hang up the phone. She hated this part. The doctor hung up, but the automated system on which the call happened, well, it takes a minute. Finally, she heard the static and click. She’d been noticing more of that kind of thing. How the tone of the milk changed as she filled a cup, or the way her footfalls echoed when she was near a door. She’d kind of started making sounds, clapping her hands or stepping a bit louder, just to experiment with the sounds. Not too much, though. She never knew if someone was looking in through the window at that weird blind girl clapping at nothing or stomping her feet.

Anna smiles at the memories, but reminds herself that she’s got a thing to try. Through some miracle, her mom had found something built into the iPhone, called VoiceOver. It was supposed to read out what was on the screen and guide you through tasks on it. So, she told Siri to “Turn on VoiceOver.”

Immediately, VoiceOver turned on. She heard a voice say “VoiceOver on.” And then it read the first app on the Home Screen. She was impressed, but wasn’t sure what to do next. The voice then said “double tap to open.” She tapped the screen twice. It read another app on the screen… Twice.

She continued playing with it, but couldn’t make much sense of it. She could feel around the screen, but then what? When she tapped on an item to open it, it just read that item. She tapped twice like it said, but it just read the item again. In frustration, after hearing “double tap to open” for the hundredth time, she quickly jabbed twice at the phone. Miraculously, the app opened. She had been touching the weather app. She felt around the screen, hearing about, well, the weather. Exasperated, she put the phone down and took a nap. She deserved it.


And here we have the meat of the issue. Yes, I tried asking Siri that question, about how a blind person would use an iPhone, and she gave that “here’s what I found on the web” answer. This is even on the iOS 17 beta, where Siri is supposed to be able to read webpages. But Siri can’t even ask the person whether it should read the page or not. Because that’s just too hard.

Siri could have easily been programmed to tell the person about VoiceOver, and at least give tips on how to use it. But no. If there’s a screen, people inherently can view the screen, and a digital talking assistant shouldn’t get in the sighted person’s way. Right?

So, let’s look at accessibility here, in the context of a voice assistant. The voice assistant should never, ever assume that a person can see the screen. Like, ever. If it’s pulled up a long article, the assistant should offer to read the article. It doesn’t matter if a screen reader is on or not. As you read above, Anna didn’t even know the concept of a screen reader, let alone that one is on the iPhone.
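
To make that concrete, here’s a minimal sketch, in Kotlin for Android, of what an assistant’s result handler could look like if it never assumed sight. TextToSpeech is the real Android API; everything else here, ArticleResult, the phrasing, the flow, is hypothetical, not how Siri or Assistant actually work.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Hypothetical sketch. The point: never gate spoken output on whether
// a screen reader is running. ArticleResult and the wording are made up;
// TextToSpeech is the real Android API.
data class ArticleResult(val title: String, val body: String)

class SpokenResultPresenter(context: Context) {
    private var ready = false
    private val tts = TextToSpeech(context) { status ->
        ready = (status == TextToSpeech.SUCCESS)
    }

    // Called when the assistant has fetched a web answer.
    fun presentResult(result: ArticleResult) {
        if (!ready) return
        // Offer the content out loud unconditionally. A sighted person
        // can ignore it; a blind person may have no other way in.
        tts.speak(
            "I found an article called ${result.title}. " +
                "Say 'read it' and I'll read it to you.",
            TextToSpeech.QUEUE_FLUSH, null, "offer"
        )
    }

    fun readItOut(result: ArticleResult) {
        tts.speak(result.body, TextToSpeech.QUEUE_ADD, null, "body")
    }
}
```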

Mobile phones have amazing accessibility potential specifically because of the voice assistant. The voice assistant is in popular culture already, so a newly blind person can at least use Siri, or Google Assistant, or even Bixby.

So, an assistant should be able to help a blind person get started with their phone. The digital assistant can lead a blind person to a screen reader, or even read its documentation like a book. If Amazon’s Alexa can read books, then Apple’s Siri can handle reading some VoiceOver documentation.

Moving forward, Anna had no idea how to use VoiceOver. Performing a double tap is incredibly difficult if you don’t have another person there who already knows, and can show you. You may think that it’s just tapping twice on the screen, but you have to do it very quickly compared to a normal tap. A little too slow, and it just thinks you’re trying to tap twice. And even when that’s done, the person has to learn how to swipe around apps, type on the keyboard, answer and end calls, and where the VoiceOver settings are to turn off the hints that are now getting in her way and making her lose focus on what she’s trying to do.
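
To make the timing concrete: on Android, for comparison, two taps only count as a double tap if the second lands within a system-wide timeout, typically around 300 milliseconds. Here’s a rough sketch of that logic; ViewConfiguration.getDoubleTapTimeout() is the real constant, and the rest is deliberately simplified.

```kotlin
import android.os.SystemClock
import android.view.ViewConfiguration

// Simplified sketch of double-tap recognition. Real gesture handling is
// more involved; the timeout constant is the real Android one (~300 ms).
class DoubleTapDetector(private val onDoubleTap: () -> Unit) {
    private var lastTapTime = 0L
    private val timeout = ViewConfiguration.getDoubleTapTimeout()

    fun onTap() {
        val now = SystemClock.uptimeMillis()
        if (now - lastTapTime <= timeout) {
            onDoubleTap()     // second tap landed inside the window
            lastTapTime = 0L
        } else {
            lastTapTime = now // too slow: counted as a fresh single tap
        }
    }
}
```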

It’s been more than a decade since VoiceOver was shipped. It’s seriously time for a tutorial. And yes, while I know that there are training centers for the blind across the US, I’m pretty sure there are plenty of blind people that never even hear about these centers, or are too embarrassed to go there, or can’t go due to health reasons.

The screen reader, VoiceOver, in this case, was even less helpful than the voice assistant. Imagine that. A voice assistant helped Anna make a phone call. VoiceOver, due to the lack of any kind of usage information, just got in her way. That’s really sad, and it’s something I see a lot. People just turn back to Siri because it’s easier, and it works.

The idea that blind people just “figure it out” is nothing but the shirking of responsibility. Training centers should not have to cover the basics of a screen reader. But we do, and it takes anywhere from two weeks to a month or more of practice, keeping the student from slipping back into the sweet tones of Siri that can barely help them, and showing them all of what VoiceOver can do, which Apple just can’t manage to tell the person. Because it’s so much easier to show off a new feature than to build up the foundations just a tad bit more.

Context matters. Yes, an advanced user won’t need to know all of VoiceOver’s features and how to do things on the phone. But someone who has just gone blind, or someone picking up their first iPhone, needs this. I don’t care if it’s Siri introducing the screen reader after the person asks how a blind person can use a phone, or VoiceOver asking the person if they’d like to use it after a good minute or two of no activity on the home or lock screen, or a tutorial popping up the first time a blind person uses VoiceOver. But something has got to change, because there are newly blind people every year. It isn’t like we people who were born blind are dying off with no blind people replacing us, so that VoiceOver can be treated as a temporary measure until, 50 years from now, there are no more blind people left and VoiceOver can be left to rot and die.

Disabilities, whether we like it or not as a society, or even as a disability community, are going to be with us for a long time to come. You see how much money is poured into a big company’s accessibility team? Very little. That’s about how much money is also poured into medical fixes for disability. If it weren’t so, we’d all be seeing, and hearing, and walking, and being neurotypical right now. They’ve had a good 40 years of modern medicine. They are not going to fix this in the next 40 years, either. And even if they do fix it for every disability, some people don’t want to be typical. They find pride in their disability. That’s who they are.

How a screen reader and operating system work together

For the sake of my readers, the ones left at this point, and myself, I’m going to continue to use Anna for this section as well. Let’s say she has figured out the iPhone thanks to her mom reading articles to her, and now her time to upgrade her phone has arrived. But she’s not found a job yet, and an iPhone is way out of her current budget. Oh, did I tell you she was “gracefully let go” from her old job at Walmart? Yeah, there’s that too. Her bosses just knew that she couldn’t possibly read the labels on the goods she normally organized and put on the shelves, because they don’t have Braille on them.

Now, Anna is looking for a new phone. Her iPhone X is getting slow, and she’d finally paid it off with the help of her new SSDI benefits, and her mom where needed. She doesn’t like to ask people to do things for her unless absolutely necessary.

So, she goes to a phone store and picks out the best Android phone she can find. It’s not a flagship phone by any means. It’s not even a flagship killer. It’s just a Samsung phone. She pays for it, down a good $250, and walks out with the phone.

Knowing how things generally are now, she holds down the power button, feels a nice little buzz (but not as nice as her iPhone could make), and asks her mom to read her some articles on how to use a Samsung with… whatever VoiceOver it has.

So to make a long story short, she turns on TalkBack, goes through the tutorial built right into the screen reader, and is instantly in love. She can operate the screen reader, without her mom even needing to read her the commands and how to do them. She hits the finish button on the tutorial, and is ready to go.

She first wants to make sure she’s running the latest version of everything. So, she opens the Galaxy Store, and feels around the screen. She finds a link of some kind, then a “don’t show again,” and a “close” button. Puzzled, she activates the close button. Now, she’s put into the actual store. She feels around the screen, finding tabs at the bottom. She’s not really sure what tabs are, but she’s seen them on some iPhone apps. She finds a “menu” tab, and activates that. Feeling around this new screen, she finds the updates item, and double taps on it. Nothing happens. She tries again, thinking she didn’t double tap hard enough, or fast enough, or slow enough. No matter what she tries, nothing happens.

Frustrated and confused, Anna swipes to the left once, and finds a blank area. She decides to double tap it, thinking maybe that’s just how some things work on Android, like how sometimes on a website, a text label is right before a box where the information for that label goes. And she’s right. The updates screen opens, and she’s given a list of updatable apps.

She then decides to look at the Play Store. She’s heard of that before, in ads on the TV and radio, where people are talking about downloading an app. She finds this app a bit easier to use, but also a lot more cluttered. She finds the search item easily, but cannot find the updates section. She decides to just give that a break for now.

She decides to call her mom on her new phone, and see how that works. So, she double taps and holds on the home button to bring up Google Assistant. She knows how to do this because I don’t feel like making up some way she can figure that out. Oh and let’s say her contacts were magically backed up to Google and were restored to her new phone.

TalkBack says “Google Assistant. Tap to dismiss assistant.” Startled, the command Anna was about to give is scattered across her mind. She finds and double taps on the back button. She tries again. The same thing happens. She’d never had that issue before. Trying to talk past TalkBack, she talks over the voice saying “tap to dismiss assistant,” and says “call”… TalkBack echoes “call,” then says “double tap to finish.” Assistant then says “Here’s what I found.”

Anna is done for the day. She puts the phone away, and from then on, just uses Samsung’s assistant that I don’t remember how to spell right now. Maybe after a year or two of this, she goes back to an iPhone. Let’s hope she finds out about a training center and gets a job somewhere, and lives happily ever after. Or something.


Having a screen reader work with an operating system requires tons of collaboration. The screen reader team, and indeed every accessibility team that deals with an operating system, should be in the midst of the operating system teams. From user interface elements to how a person is supposed to do things like update apps or talk to an assistant, these things should be as easy as possible for a person using a screen reader. You, as a developer far removed from the user, don’t know if the person interacting with your software has just lost their vision, or is having a particularly suicidal day, or has anything else going on in their life. It’s not that far of a stretch to say that your software may be the difference between a person giving up, or deciding that maybe being blind isn’t a life-ending event after all.

Let’s go through what happened here. TalkBack, in this case Samsung’s TalkBack, which is still on version 13 even though 14 has been out for months now, does have a built-in tutorial. It’s been there for as long as I’ve used it, and they keep updating it. That’s extremely commendable for the TalkBack team.

Now, let’s focus on the Galaxy Store for a moment. An ad wasn’t a big deal in this case. It didn’t jumble up and slow down the screen reader, like some other ads tend to do. It was just a simple link. But in a lot of cases, these ads aren’t readable as ads. They’re just odd things that show up, with buttons to not show the ad again, and to close the ad. It’d be kinda nice if the window read “ad” or something before the ad content.
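
For what it’s worth, that fix costs a developer almost nothing. A hedged sketch in Kotlin, assuming a hypothetical ad container layout; contentDescription is the real Android API:

```kotlin
import android.view.View
import android.widget.TextView

// Hypothetical view names; contentDescription is the real API.
fun markAsAd(adContainer: View, headline: TextView, closeButton: View) {
    // A screen reader now says "Advertisement" before the ad's content.
    adContainer.contentDescription = "Advertisement: ${headline.text}"
    closeButton.contentDescription = "Close advertisement"
}
```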

Now, for the worst thing about the store. The download and update buttons are broken now, with no fix coming any time soon. Operating system developers, even if they’re developing from a base like Android, need accessibility teams including a blind person, Deaf person, and so on. And those disabled people need to have enough power to stop an update that breaks a part of the experience, especially one as important as updating apps.

A screen reader can only read what an app tells it to read, unless there’s AI involved. And for me, there was. When TalkBack landed on that unlabeled item, it said something like “downloads,” so I knew to click on that instead of the item label that was just sitting there outside the actual item. A screen reader is only part of a wider system. An app has to be labeled correctly. An operating system has to be able to tell the screen reader about the item label. And the person has to know how to use the screen reader, and operating system, and app.
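
Here’s roughly what the fix for that broken updates item could look like on the app side. This is a sketch under my assumptions about the store’s layout, not Samsung’s actual code; contentDescription and importantForAccessibility are real Android APIs:

```kotlin
import android.view.View
import android.widget.ImageButton
import android.widget.TextView

// Sketch of the unlabeled-button fix: give the clickable view itself an
// accessible name instead of leaving the visible label as a separate,
// non-clickable node. The view names are my guesses at the layout.
fun labelUpdatesItem(updatesButton: ImageButton, visibleLabel: TextView) {
    // The screen reader can only speak what the focused node carries.
    updatesButton.contentDescription = visibleLabel.text // e.g. "Updates"
    // Hide the now-redundant text node so focus lands on the real control.
    visibleLabel.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_NO
}
```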

And finally, the Google Assistant. Context matters a lot here. What is the user trying to do? Talk to a digital assistant. Would you like it if the person you were talking to suddenly talked over you, and then began repeating the things you’re saying? No? I thought not. In these kinds of contexts, a screen reader should be silent. The digital assistant, if needed, should be reading any options that appear, like choices for a restaurant or phone number. There should be no doubt here. There are times when a screen reader should be silent, and this is one of the most important. If you are wondering about this, ask a blind person. We do exist, and we’ve had enough of these stupid issues that could be resolved if we were just asked.
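
On Android, an assistant’s overlay could arguably do this today by hiding its transient UI from the accessibility tree while it’s listening. A sketch, with a hypothetical assistantOverlay view; the importantForAccessibility flags are real Android APIs:

```kotlin
import android.view.View

// Hypothetical assistantOverlay view; the flags are real Android APIs.
fun setScreenReaderSilenced(assistantOverlay: View, listening: Boolean) {
    assistantOverlay.importantForAccessibility = if (listening) {
        // TalkBack won't announce the overlay or anything inside it.
        View.IMPORTANT_FOR_ACCESSIBILITY_NO_HIDE_DESCENDANTS
    } else {
        View.IMPORTANT_FOR_ACCESSIBILITY_AUTO
    }
}
```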

Conclusion

Alt text is just the tip of the iceberg of accessibility. After all, if a person isn’t told how to use the screen reader, what good is alt text if they can’t even get to the site to read it? A screen reader is not enough. An app has to be built to be accessible and easy to use, or a person won’t know where to go to do the thing they want. An operating system has to have a good enough accessibility framework to allow a screen reader to, say, turn off its own touch interaction methods to allow a person to play a video game, or play a piano. All of these pieces create the accessibility experience that a person has. And this isn’t even getting into Braille and tactile interfaces.

I hope this post has shown some of what we as blind people experience. After a decade of development, you’d think some of these things, issues, and capacity for issues, would have been resolved a long time ago. But developers still don’t understand that they’re not developing alone, and that it takes constant cooperation to make things not just accessible, as in every button is labeled, but great, as in the person is confident in how to use their screen reader, the app they’re in, and the operating system as a whole. Until then, these problems will not be resolved. A company-wide, cultural mindset does not fix itself.

Blind people are so used to this that very few even know that they can contact Apple or Google developers, and fewer still think it’ll do any good. They see developers as ultra-smart people in some castle somewhere, way out of mind of the people they develop for, and there’s a reason for that. Developers are going to have to speak with the people that rely on what they build. And not just the people that will tell the developers what they want to hear, either. Developers need to be humble, and never, ever think that what they’re building is anywhere close to good enough, because once they do, they start to dismiss accounts of things not working right, or a need for something better, or different, or more choices. Until these things happen, until people can trust developers to actually be on their side, these kinds of stories will continue, unnoticed and undiscussed.

Categories: accessibility Android apple blindness


TalkBack 14: Rushed Steps in the Right Direction, but still so far behind

I’ve talked a lot about Android on this blog. I love the idea, and the operating system is nice, but accessibility could be so much more. I’ve had my Galaxy S20 FE (5G) for about a year and a half or so. I’ve seen the changes from Android 11, to 12, and finally to 13. TalkBack has improved steadily over that year and a half: text recognition in images and unlabeled buttons was added in TalkBack 12.1; spell checking and icon descriptions were added in TalkBack 13; and Braille support arrived, which I thought wouldn’t come for another five to ten years.

In this article, though, I’ll overview the changes in TalkBack 14, the ones I have access to, that is; I’ll get to that later. I’ll also talk about the problems facing Android that aren’t really about TalkBack, but are more about the accessibility framework, and what apps can and can’t do. So this will be a sort of continuation of my other Android articles, more than just a “what’s new in TalkBack” style article.

TalkBack 14, Lots of Braille

TalkBack 14 is a good iteration on what TalkBack 13 started. TalkBack now has many more commands, both for Braille displays and for its Braille keyboard. One can now move around an edit field by words and lines, not just characters, using the onscreen Braille keyboard. One can also select text, and copy and paste it, using the same keyboard. You don’t have to dismiss the keyboard just to do all that. To be fair to iOS, you can do that with Braille Screen Input, but the commands are not documented in either Apple’s documentation or the VoiceOver settings. In TalkBack’s settings, those commands are clearly documented and explained.

TalkBack 14 now supports the NLS EReader, which is being freely distributed to NLS patrons; by the end of the year, all 50 states will have the EReader. You do have to connect the display to your phone via USB-C, and the cable I had on hand shorted out, so I have to find another one. But I was able to use it with a USB hub, which made the setup even less mobile, but it did work. The commands, though, were rather more complicated than I expected. I had to press Enter with dots 4-5 to move to the next object. Space with dot 4 moved to the next line, and Space with dot 1 moved to the previous line. So I quickly moved back to using the EReader with the iPhone. I’ll practice with it more, but for now it just doesn’t feel as practical as using the EReader over Bluetooth on the iPhone, with its simpler commands.

A window into Images

TalkBack 14 has a new screen of choices, where you can enable options regarding image recognition. You have the usual text recognition and icon recognition, but the screen also refers to “image recognition,” similar to what VoiceOver can do. This is something I’ve wanted for a long time. Some people have a third option, “image descriptions,” but I don’t have that option. Google often rolls out features to a small subset of users, and then rolls them out to everyone else after weeks or months of testing. We’ll have to see how that works out.

Of note, though, is that whenever one gets an iOS update, one gets all the new features right away. There is no rollout of features for VoiceOver; it’s just there. TalkBack 14, as a public release, should have all the features available to everyone at launch, in my opinion. They could always label image descriptions as “beta.”

The Accessibility Framework

As I’ve said before, the operating system is the root of all accessibility. If the accessibility framework is limited, then apps are limited in what they can do as far as accessibility is concerned. This is why I’ve been so critical of Google, because Android’s accessibility framework, and what apps can communicate to TalkBack, is limited. I’ll give a few examples.

Kindle

I love the books I can get on Kindle. I love that I can read them on just about all of my devices. But not all Kindle apps are created equal. The app on the iPhone is great. Using VoiceOver, I just swipe down with two fingers and the book is read to me. I can move my finger up and down the screen to read by line. I can use a Braille display and just scroll through the book, no turning pages required, since it happens automatically. On Android, however, the Kindle app is more limited.

When you open a book in Kindle for Android, you find a page, with a “start continuous reading” button. All this button does is pipe the text of the book out to the Android speech engine. This distinction is important. On iOS, since VoiceOver is controlling things, you can quickly speed up, slow down, pause and resume, or change the voice quickly. On iOS, you can read by word or letter, and most importantly, read easily with a Braille display.

On Android, you can move your finger down the page to hear lines of text, which are sent to TalkBack as announcements. But if you try to have TalkBack read the book, it won’t get past the current page. The same is even more true with Braille; you have to turn pages manually, using the touch screen, because it’s not actually TalkBack that’s turning the page. So you have to keep touching the phone’s touch screen in order to continue interacting with the app. Braille displays have keys for a reason. You shouldn’t have to use the touch screen to do anything while using a Braille display with your phone. Most Braille display users keep their phone in their pocket while they use it from their displays.

With a lot of other book-reading apps, you can just lock the screen, and just listen to the book. Many blind Android users love that, and find it superior to reading with a screen reader. However, the Kindle app doesn’t even satisfy that. Whenever the screen times out and locks, after that page is finished, the page is turned, but the speech stops. You have to unlock the screen, and then press “start continuous reading” again.

Now, if TalkBack could read the book and turn the page, the experience would be much better. But Google’s accessibility framework has moved at a glacial pace throughout the ten or fifteen years of Android, and iOS, development. While Apple opened up APIs to developers, so that VoiceOver could turn pages while reading, Google has not even added that feature to their own reading app. Instead, Play Books uses a web view, and just detects when the user has gone beyond the last element on the page, and then turns the page. At least, that’s what I think is happening; I obviously don’t have access to the source code of the Play Books app.
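
On the Android side, some building blocks already exist. Here’s a sketch of what a reading app could expose so a screen reader can advance pages itself. replaceAccessibilityAction and ACTION_SCROLL_FORWARD are real androidx.core APIs; pageView and turnToNextPage are hypothetical, and whether TalkBack’s continuous reading would then auto-advance is my assumption, not a tested claim.

```kotlin
import android.view.View
import androidx.core.view.ViewCompat
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat.AccessibilityActionCompat

// pageView and turnToNextPage are hypothetical; the action plumbing is
// the real androidx.core API. turnToNextPage returns true on success.
fun exposePageTurning(pageView: View, turnToNextPage: () -> Boolean) {
    ViewCompat.replaceAccessibilityAction(
        pageView,
        AccessibilityActionCompat.ACTION_SCROLL_FORWARD,
        "Next page"
    ) { _, _ -> turnToNextPage() }
}
```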

Mortal Kombat

Games are becoming more and more important in the mobile ecosystem. Mobile games are, in some cases, more popular than console games. But mobile games are sometimes very hard to make accessible. Take the Mortal Kombat game. You have an interface where you choose a game mode, make a team of fighters, upgrade cards, and change settings. Then, you have the fight mode, where you tap to attack, swipe for a special attack, and hold two fingers on the screen to block. On iOS, the developers have made the buttons visible to VoiceOver, and added labels to them. They’ve shown the text elements, where you “tap to continue”, to VoiceOver, and allowed the double tap to advance to the next screen. That part, I believe, could be done on Android as well.

The real fun is in the battles, though. Once a fight starts, on iOS, VoiceOver is pushed out of the way, so to speak, by a direct touch area. This allows taps and swipes to be sent directly to the app, so that I can play the game. While I’m fighting, though, the game sends text prompts to VoiceOver, like “swipe up,” or “Tap when line is in the middle.” I’m not sure exactly what the last one means, but “swipe up” is simple enough. This allows me to play, and win, battles.

Unfortunately for Android users, though, this “direct touch area” is not possible. Google has not added this feature for app developers to take advantage of. They theoretically could, but they’d then have to make an accessibility service for the app, and then make sure that the service is running when the app runs. Users are not going to turn on an accessibility service for a game, and developers are not going to spend time dealing with all that for the few blind people, relatively speaking, on Android.

Catching the Apple

Google, for the last few years, has been trying hard to catch up to Apple. They have a long way to go. Apple, however, hasn’t stayed still. They have a decade’s worth of built-up experience, code, frameworks, and blind people who, each time they try Android, find that it falls short and come back to iOS. I’m not saying Apple is perfect. And each time a wave of blind people tries Android, a few find that it works for what they need a phone for.

As more and more blind people lean into using a phone as their primary computing device, or even their secondary computing device, accessibility is going to be more important than ever. We can’t afford half-baked solutions. We can’t afford stopgap measures. Companies who build their services on top of these platforms will do what they can to make their apps accessible, but they can only do so much. In order to make better apps, developers need rich, robust APIs and frameworks. And right now, that’s Apple. I’ve gotten tired of holding my breath for Google. I’m just going to let out that breath and move on. I’ll probably keep my Android phone around, but I’m not going to use it as my primary device until Google gets their act together.

Some Android users will say that I’m being too harsh, that I’m not giving Google enough time, or that I’m being whiny, or radical, or militant. But it took Google ten or so years to add commands that used more than one finger. It took them ten years to add Braille support to their screen reader. It took them ten years to add spell checking. I’m not going to wait another ten years for them to catch up to where Apple was a good three years ago.

Categories: accessibility Android blindness Google iPhone productivity


Why I use Large Language Models

The debate over AI rages on, and I find myself caring less and less as the tug of war grows fiercer between one side saying that AI is a threat to humanity and the other saying that AI can do lots of amazing stuff and definitely couldn’t take our jobs. No, AI cannot take our jobs. Rich people can take our jobs and give them to AI, though.

This post isn’t going to be about the rich people that bend AI, and anything else they can, to their will. This post is about why I use large language models, especially multimodal ones, and why I find them so useful. A lot of people without disabilities, particularly those who aren’t blind, probably won’t understand this. That’s okay. I’m writing this for myself, and for those who haven’t gotten to use this kind of technology yet.

Text only models

ChatGPT was the first large language model I used. It introduced me to the idea, and to the issues, of the model. It couldn’t give an accurate list of screen reader commands. But it could tell me a nice story about a kitten who drinks out of the sink. From the start, I wondered if I could feed the model images. I tried with ASCII art, but it wasn’t very good at describing that. I tried with Braille art, but it wasn’t good at that either. I even tried with an SVG, but it couldn’t fit the whole thing into the chat box.

I was disappointed, but I kept trying different tasks. It was able to explain the output of some Linux commands, like top, which doesn’t read well with a screen reader. It was even able to generate a little Python script that turned a CSV file into an HTML table.
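
Not the actual script it produced (that one was Python), but here’s the same idea re-sketched in Kotlin to show how small the task is. It naively assumes the CSV has no quoted commas, and treats the first row as the header.

```kotlin
import java.io.File

// Naive CSV-to-HTML conversion: no quoted-comma handling, first row
// becomes the header row. "data.csv" below is a hypothetical input file.
fun csvToHtmlTable(csvPath: String): String = buildString {
    appendLine("<table>")
    File(csvPath).readLines().filter { it.isNotBlank() }
        .forEachIndexed { i, line ->
            val cell = if (i == 0) "th" else "td"
            appendLine("  <tr>")
            line.split(",").forEach { field ->
                appendLine("    <$cell>${field.trim()}</$cell>")
            }
            appendLine("  </tr>")
        }
    appendLine("</table>")
}

fun main() {
    println(csvToHtmlTable("data.csv"))
}
```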

As ChatGPT improved, I found more uses for it. I could ask it to generate a description of a video game character, or describe scenes from games or TV shows. But I still wanted it to describe images.

My Fascination with Images

I’ve always wanted to know what things look like. I’ve been blind since birth, so I’ve never seen anything. From video games to people to my surroundings, I’ve always wondered what things look like. I guess that’s a little strange in the blind community, but I’ve always been a little strange in any community. So many blind people don’t care what their computer interface looks like, or what animations are like, or even if there is formatting information in a document or book. I do. I love learning about what apps look like, or what a website looks like. I love reading with formatted Braille or speech, and learning about the different animations used in operating systems and apps. I find plain screen reader speech, without sounds and such, to be boring.

So, when I heard about the Be My Eyes Virtual Volunteer program, I was excited. I could finally learn what things look like. I could finally learn what apps and operating systems look like. I could send it pictures of my surroundings, and get detailed descriptions of them. I could send it pictures of my computer screen, and understand what’s there and how it’s laid out. I could even send it pictures from Facebook or Twitter, and get more than a bland description of the most important parts of the image.

I began trying the app, with saved pictures and screenshots. The AI, GPT-4’s multimodal model, gave excellent descriptions. I finally learned what my old cat looks like. I learned what app interfaces like Discord look like. I sent it screenshots of video games from Dropbox, and learned what some video game characters and locations look like.

Now, it’s not always perfect. Sometimes it imagines details that aren’t there. Sometimes it doesn’t get the text right in an image. But if a large language model is a blurry picture of the web, I’d rather have that than a blank canvas. I’d rather see a little than not at all. And that’s what these models give me. No, it’s not real sight. I wouldn’t want to wait a good 30 seconds to get a description of each frame of my life. But it’s something. And it’s something that I’ve never had before.

Feeding the Beast

A lot of people will say that these models just harvest our data. They do. A lot of people will then say that I shouldn’t be feeding their Twitter posts, video games, interfaces, comic books, and book covers into the models. My only response to that is that if all these things were accessible to me, I wouldn’t have to feed them to the models. So if you don’t want your pictures in OpenAI’s next batch of training data, add descriptions to them. If you don’t want your video game pictures used in the next GPT model, make your game accessible. If you don’t want your book covers used in the next GPT model, add a description to them. That’s just all there is to it. I’m not giving up this new ability to understand visual stuff.

Categories: accessibility blindness sight


The Rings of Google's Trusted Testers program

Over the years, I’ve owned a few Android devices. From the Samsung Stratosphere to the Pixel 1 to the Braille Note Touch, and now the Galaxy S20 FE 5G. I remember the eyes-free Google group, where TalkBack developers were among us mere mortals. I remember being in the old TalkBack beta program. I remember when anyone in the Eyes-free group could be in the beta. And now, that is no longer the case.

In this post, I’ll talk about the Accessibility Trusted Testers program, how it works in practice, in my own experience, and how this isn’t helpful for either TalkBack as a screen reader or Google’s image as a responsive, responsible, and open provider of accessibility technology. I won’t name names, because I don’t think the state of things results from individual members of the TalkBack or accessibility team. And as I’ve said before, these are my experiences. Someone who is more connected or famous in the blind community will most certainly have better results.

The Outer Ring

When you open the link to join the accessibility trusted tester program, you’ll find this text:

Participate in early product development and user research studies from select countries.

After signing up, you’ll get an email welcoming you to the program. Afterwards, you get emails about surveys and sessions you can do. This isn’t just for blind people, either. There are a lot of sessions for visually impaired people, Deaf people, and wheelchair users. And yes, there are a lot more of those than sessions for blind people. A good many of them require that you be near a Google office, so they require transportation. I won’t go into detail about what the sessions and surveys are about, but this overview should give you a good enough idea.

The inner ring

Now we get into the stuff that I take issue with. There is no way for someone not in the loop to even know this ring exists. If you contact someone on the accessibility team at Google, you can ask to be placed in the TalkBack and/or Lookout testing programs. Depending on who you ask, you may or may not get any response at all. Afterwards, the process may get stuck in a few places: searching for you in the program, calling out to another person, and so on. And no, I’m not in either private beta program. The last time I heard from them was two months ago now.

The things I have issues with are many, and I’ll go over them. First, when people sign up for these trusted tester programs, they think that, because it’s a “tester” program, they’ll gain access to beta versions of TalkBack and so on. They don’t.

Second, some of these sessions require you to travel to Google’s offices. There are blind people scattered across states, countries, and provinces, and few Google offices. So, if a blind person wants to attend a session, they may have to travel to California to do so. And that means that only Californian blind people who are in the program will even know about the study, let alone attend.

And third, the biggest, is this. When the program opened up after the demolition of the Eyes-free group, the people who had been using Android the longest flooded in. So, throughout all these years, it’s been them, the people most used to Android, providing the feedback. People who haven’t used iOS in years, people who don’t care about images and who have found their preferred apps and stick with them. So, when new people come to Android, the older users have a bunch of third-party apps, for email, messaging, launchers, and so on. Sure, the new users can talk about how the first-party experience is on the Blind Android Users mailing list and Telegram group, but the older users always have some third-party way of doing things, or a workaround of “use headphones” or “mute TalkBack” or “use another screen reader” or “go back to the iPhone”. And I’ve nearly had enough of that. Sighted people don’t have to download a whole other mail client, or mute TalkBack while talking to Google Assistant, or use a third-party Braille driver like BRLTTY, or use an iPhone to read Kindle books well in Braille or talk to the voice assistant without being talked over.

Also, the Trusted Testers program only covers the US and maybe Canada. Most blind Android users live in other countries, so their voices are, for all intents and purposes, muted, and the TalkBack beta program will never catch issues on the devices they use. A great example of this is the spell checking released in TalkBack 13.1. On Pixels, when you choose a correction, that word is spelled out. On Samsung and other phones, it’s not. It makes me wonder what else I’m missing by using a non-Google phone. And that’s not how Android is supposed to work. If we now have to buy Google phones to get the best accessibility, how is that better than Apple, where we have to buy Pro iPhones to get the most and best features?

How this can be fixed

Google has a method by which, in the Play Store, one can get the beta version of an app. Google could use this for TalkBack and Lookout; there is absolutely nothing stopping them from doing so. Google could also release source code for the latest TalkBack builds, including beta and alpha builds, and just have users build at their own risk. Google could open the beta programs to everyone who wants to leave feedback and help. After all, it’s not just Google phones that people use, and the majority of blind people don’t use Pixel phones. Blind people also have spaces for talking about Android accessibility, primarily the Blind Android Users mailing list and Telegram group. I’d love to see Google employees hanging out there, from the TalkBack team to the Assistant team, the Bard team, and the Gmail and YouTube teams. Then we could all collaborate on things like using TalkBack actions in YouTube, moving through a thread of messages in Gmail, and having TalkBack not speak over someone talking to the Assistant, with or without headphones in.

How can I help?

If you’re working at Google, talk to people about this. Talk to your team, your manager, and so on. If you know people working at Google, talk to them. Ask them why all this is. Ask them to open up a little, for the benefit of users and their products, especially accessibility tools. If you’re an Android user, talk to the accessibility folks about it. If you’re at a convention where they are, ask them about this. If you’re not, they’ve listed their email addresses. I want anyone who wants to make Android accessibility, and TalkBack, the best they can be to be able to use the latest software, run the betas, and provide feedback directly to the people making them. Google doesn’t need to be another Apple. Even Apple provides beta access, through iOS betas, to any eligible iPhone. Since Samsung barely does any TalkBack updates until half a year or more later, it’s seriously up to Google to move this forward. I’ve known people who plug their phone into a docking station and use it as a computer. I want blind people to be able to do that.

In order to move this forward, though, we need to push for it. We need to let Google know that hearing from a few people who have been using Android for the past 10 years isn’t enough. We need to let them know that there are more countries than the United States and Canada. We need to let them know that we want to work with them, to collaborate with them, not for them to tell us what we want through a loud minority.

TalkBack doesn’t have as many options and features as VoiceOver, but it has started out on solid ground. ChromeVox doesn’t have as many options and features as JAWS, but it too has started on a solid foundation. Together, though, the community and Google can make both of these platforms, with the openness of Android, on phones and Chromebooks (and Linux containers on Chromebooks), the best they can be! All it takes is communication!

Categories: accessibility Android blindness Google productivity

Published: July 2, 2023

On Regaining sight, and the need for Blind Culture

So this is in response to this blog post regarding Mr. Beast’s blindness video, which shows the perspective of a person who still has some remaining vision. I, however, have none. I am completely blind. I wanted to write a response that shows my perspective on the video and on the possibility of regaining sight. I speak for myself, not for anyone else in the blind community or culture. I’ll also talk about the need for a Blind culture, and the cheapening of such culture by these types of videos.

The video, titled “1000 Blind People See for the First Time”, was hard for me to watch, in several ways. First, the title is a lie; I’ll go through that later. Second, a lot of it is visual in nature, with very little description. Third, it has no audio description. Audio description is where a video has another audio track that plays alongside the original, in which a person describes the events that happen during the video. Not just the text, or the main idea as in this video, but the people, places, and actions in it. Sometimes, YouTube videos have “separate but equal” described versions, but I searched and could not find one, using my admittedly slower Mac with Safari and VoiceOver. Still, if this video is about blind people, it should be suitable for blind people as well. It reminds me of the accessibility overlay companies, which will gladly post images on social media without descriptions.

The Big Lies

Let’s pick apart the title of the video. I’m not going to concern myself with “1000”. That seems cruel to me, just picking an arbitrary number like that, but let’s move on. “Blind people.” Were these people blind? No, they weren’t. In the culture I live in, they would be called “low vision” or “visually impaired.” Mr. Beast cured people with cataracts. In the worst circumstances, maybe that could mean blind. But in the video, there were people who had been blind for four months. Imagine having a disability for four months. Comparing that to my life of absolutely no vision is very harmful. It cheapens the lives of those who are actually blind, and makes me feel as if I’m not even worthy to be called blind, as if my experiences are absolutely worthless, my work is worthless, and my life is meaningless. Let’s move on to the next part: “For the First Time.” Again, a lot of these people could see before. Maybe they couldn’t see perfectly, but their eyes could perceive enough to live mostly normal lives. They could see people’s faces, with glasses. In fact, one of the people in the video said something like “Well, I don’t need these anymore.” I don’t know for sure, because again the video wasn’t described, but I’m pretty sure she was referring to glasses. You know what? If I could see well, even enough to not need a screen reader, I’d happily, happily take glasses over what I have now. If I needed glasses and a screen magnifier to see a computer or television screen, I’d take that in a heartbeat.

The cheapening of Blind Culture

The next time you meet a Deaf person, I want you to ask them how they would respond if a person with some hearing loss approached them and claimed that they, the person with hearing loss, were also Deaf. Now, I don’t know any Deaf people personally, and am not Deaf myself. But I’m pretty sure it wouldn’t go over well, to say the least. The reason I think this is that I never hear hard of hearing people call themselves Deaf. Why? Because their experience is not the same as a Deaf person’s. The same applies to Blind people, if indeed we want to be a stronger culture. We must be allowed to have our own words, our own experiences, our own culture. To do otherwise is to weaken and cheapen the bond shared by all totally blind people.

If, instead, we allow videos like this to claim our words, our terms, and our experiences, we’ll need to retreat to where Autistic people are now, having to call themselves “Actually Autistic.” Why? Because the broader culture claimed their word, their way of finding and talking about one another and explaining themselves. So now they have to use another hashtag to show who they actually are. And honestly, we have so little power as it is. If we allow the term “blind” to be used for those who have usable vision, the general population will come to think that there are no people who can’t see at all, and when we tell them that we cannot see, they’ll think we’re lying. This already happens to me sometimes. And it’s yet another slap in the face.

And here we get to the biggest problem with this video: its effect on the general population. Now, if people don’t look any further, which many conservatives have proven they will not, they’ll think that blindness is curable in 10 minutes, which it’s not. There are so many causes of blindness, and so many do not have cures. A lot of blind or low vision people do not want to be cured, being fine with the life they live or the vision that they do have. And now, sighted people have a video to point to. “Dude, it’s just ten minutes,” they’ll say. “Are you so anti-vaxxxxx that you won’t even stop being a burden on society?” Parents will guilt-trip their kids. Husbands will guilt-trip their wives. And for what? A cure that will more than likely not apply to them.

And what of people, like me, who want even to see as low vision people do? What of people who would give much for a cure, who are shown this video and asked, “Hey dude, check your mail! Maybe you were one of the 1000! Maybe you won!” Just another slap in the face. An undescribed video, an arbitrary number, a single cure, for those who are not actually blind.

Another aspect of this is that we do want to experience the world. Why else would we want pictures described, or to be able to watch videos or television, or play video games? Yes, we have our own culture. We have our audio books, text-to-speech voices, audio games, and screen readers. But we want to know the sighted culture too. Sure, some may not want to see, enjoying who they are. Others beg their god, or science, to be able to see. But whatever way we experience the world, whether through sight, visual interpretation, or reading books about the world, we love experiencing it. Even those who do not want to see the world still live in it. But this video, with its outright lies and false hope and capitalistic choosing of just 1000 people, doesn’t help anyone. From the low vision people who were “left behind,” to the blind people for whom there is no cure, to the general public who will now have another excuse to shun us, it does more harm, I think, than good.

Categories: blindness culture sight

Published: February 12, 2023

Productivity on mobile platforms

Over the past few years, I’ve seen something that troubles me. While iPhone users write books about using the iPhone on their iPhones, clear out their email on their Apple Watch and manage the rest on their iPhones, and use their iPhones as their primary computing devices, Android users feel like one cannot be productive on any mobile system. So, here’s the thing. When you are around sighted people, even at a job sometimes, what are they using? Their computer? No. They’re on their phone. Maybe it’s an iPhone, or perhaps it’s an Android; it doesn’t matter. Either way, people are doing all kinds of things on their phones. When you go to a center for blind people, what do you see? People on their computers? Sometimes, but the younger people are on their iPhones.

I’ll talk about the differences between iPhone and Android later. But this cannot be overstated: the phone is now, for the majority of sighted, and even blind, people, their main computing device. Even a few older blind people I’ve talked to would rather not use a computer now; they’re all over their iPhones. So, what does this kind of productivity look like?

Quick flicks are best

Fast access to information is important. But being able to act on that information is even more significant. If I can’t quickly archive an email, I may not mess with that mail app much. I want to get through my inbox quickly, able to read through threads of info. The iPhone does this well, allowing me to flick up or down, double tap, and the email is out of sight. Within a conversation, I can move to the previous or next message, and archive, reply to, or flag an individual message in that conversation. On Android, in Gmail, I can act upon a conversation, but inside a conversation, there are no quick actions. One must navigate through the message, along with any quoted or signature text, find a button, and double tap. Yes, there are other mail clients. Aqua Mail comes close to being like the iPhone Mail app. But it has no actions, and if one can get an honestly great mail client on an iPhone without needing to buy another app, why should someone consider Aqua Mail and Android?

A book on a phone, on a phone

I can’t get over how good Ulysses for iOS and macOS is. While I’m using Ulysses for Mac right now, I still marvel at what a person was able to make with just an iPhone, an app, and a Bluetooth keyboard. You may say, “Well, if you’ve got a keyboard, you might as well have a laptop.” To which I would show a marvelous invention called the pocket. A phone in your pocket, headphones in your ears, a keyboard in your lap (particularly one of those slim Logitech keyboards), and you’ve got a nice writing environment that is much less bulky than a laptop, with its trackpad and screen adding weight and thickness, along with the CPU and hard drive.

Next is the app. I’ve tried a lot of writing apps on Android, from iA Writer to a lot of Markdown note apps, looking for a replacement for Ulysses that would give me the power that allowed a blind person to write an entire, large book on his iPhone. And I couldn’t find it. From unlabeled buttons, to no way to navigate by heading or link inside the document, to no way to link chapters together and export them as a book, none of the apps were viable. This is not to say that no such app will exist in the future, or that Android will never have a good enough accessibility framework to allow the creation of such apps. But right now, iOS, the most locked-down operating system in the mobile space, has allowed a level of creativity from a blind writer which was previously only seen on Windows. Furthermore, it allows a far more accessible writing environment, enabled by Markdown.

Android, meanwhile, is still trying to get dictation without TalkBack speaking over the person dictating, Google Assistant without TalkBack loudly talking over it, phone calls where you don’t hear “keypad button” at the start of each call, image descriptions, a pronunciation dictionary, and so on. This isn’t to say that the iPhone and VoiceOver are perfect. They are not, and they amass bug after bug with every release. But, as of now, the iPhone is still the most productive platform. Android is coming around, speeding up quickly over the last year or so. I really hope it gets to the point where we can not only write books on our phones, but also create apps and music, and edit audio and video efficiently and effectively. At the very least, I’d love to be able to do any office work a job may require, with our phones hooked up to USB-C docking stations, keyboards, and external displays.

More than likely, though, VoiceOver on the iPhone will continue to decline. TalkBack will reach where VoiceOver is right now, and stop because they’ve run out of ideas. The blind community will continue having to come up with workarounds, like not using the Notification Center when a Braille display is connected, or using Speak Screen on older iPhones from 2020 because VoiceOver is so badly optimized that it runs out of memory while reading an online article. Meanwhile, TalkBack will gain image descriptions, and they’ll be more than “gift card,” “blue sky,” on an app where you clock in and out of work, which is what VoiceOver gives. TalkBack already speaks the text of the button, rather than describing the image on it. Yes, the button is unlabeled.

But the thing that really holds the iPhone up is the apps. Lire for RSS, Overcast for podcasts, Ulysses for writing, Seeing AI for recognition, and so on. And there’s an actual website with lists of apps for iOS. Android has podcast apps, RSS apps, writing apps, and recognition apps. And some, like Podcast Addict and Feeder, are great apps. But they don’t approach the accessibility of their iOS counterparts. Podcast Addict, for example, has the following layout when viewing episodes of a podcast: “episode name, episode name button, contextual menu button.” Overcast, on the other hand, simply has a list of episodes. Android pros get around this by saying one should just feel one’s way down the screen and scroll forward. What if one is using a Braille display or Bluetooth keyboard? What if one is both blind and lacks dexterity in the hands, and so needs to use switch access? This is the kind of thing iOS already has: a good, clean user interface. Sure, right now, it’s fallen into disrepair. Sure, you’ve got bugs crawling out from the walls. Sure, it feels sluggish on iPhones from just two years ago. But it’s still the best we have.

And this is where a sighted person cannot understand. To them, an iPhone XR is as good as the latest Galaxy phone, or even the latest iPhone, camera aside. Developers plan for sighted use. They make sure things look good and flow smoothly, from the CPU on up to the touch screen. And yet, things work so differently for blind people. Adding a podcast episode to the queue may take a simple swipe for a sighted Android user, but several swipes and taps for a blind one. And that’s why productivity, a good accessibility framework, app development tools that automatically make a view as accessible as possible, and a good, high-quality screen reader are so important. It takes all of that for a blind person to be productive, and that’s why most blind people in developed countries choose the iPhone, every time.

Categories: Android blindness iPhone productivity

Published: February 12, 2023

Quick post: On Operating Systems and Communities

To me, the core of accessibility is the operating system: what it allows, and how easy the accessibility APIs are for developers to use and for the operating system’s development team or community to upgrade. At the same time, the support system of accessibility is the community. Without a community of users and developers, the accessibility of an operating system will not flourish.

In this post, I’ll discuss the different approaches I’ve seen in operating systems, and how users and developers in the community, primarily the blind community, have improved it. These are my opinions and thoughts, and others in the blind community, like fans of Linux, Android, and Windows, will feel differently.

MacOS and Android

These two operating systems are somewhat similar, in that they’re just now really getting accessibility support from their companies. macOS had a good base, but is only now getting huge performance improvements and features that have been requested for years. Android had a stable base, if nothing else, and is also getting features that make it more worthwhile to use.

macOS has had a good-sized community of blind users since the release of the iPhone 3GS and VoiceOver, maybe even earlier, back to macOS Leopard. The community isn’t anywhere near as large as the Windows community we’ll discuss later, but it is beginning to pick up steam, I’d say, as Windows falls into disrepair and macOS improves.

The community on Android primarily improves access by writing apps that work as accessibility services, like the Commentary screen reader, or the Advanced Braille Keyboard. Developers on macOS can script VoiceOver to a limited extent, allowing them to get information on, for example, what song is currently playing in Music, or how much free space is on their system. Combining shell scripting with VoiceOver AppleScripts could give powerful options for controlling a Mac with VoiceOver. Other apps have also been created, like VOCR for OCR and image recognition. An app called Hammerspoon is also used to add user interface sounds to macOS, like when a device is connected or disconnected.
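
For context on the Android side, an accessibility service is just an app component that the system feeds UI events to once the user enables it in settings. Here’s a minimal sketch in Kotlin; the class name is hypothetical, and a real service also has to be declared in the app manifest with the BIND_ACCESSIBILITY_SERVICE permission:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

// A bare-bones accessibility service. Once enabled by the user, Android
// delivers it events like focus changes, window changes, and text edits;
// that event stream is what tools like third-party screen readers and
// Braille keyboards build on.
class ExampleService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        // A real screen reader would speak or braille this; we just log it.
        Log.d("ExampleService", "event=${event?.eventType} text=${event?.text}")
    }

    override fun onInterrupt() {
        // Called when the system wants the service to stop its feedback.
    }
}
```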

Windows and iOS

Windows is the main operating system, besides iOS, that blind people use. It’s very open, but its accessibility has markedly devolved since even the Windows 10 of a few years ago. From focus issues, to keyboard commands being taken up by features blind people will not use (and when such a feature is turned off, the shortcut is just mapped to nothing), Windows is declining. The community, however, is keeping it alive with addons and scripts for screen readers, apps for blind people, and the plentiful voices that have been created for Windows over the years.

Addons for NVDA include helpful tools like better handling of tables and the Windows console. There are also super-power tools like an automatic translation tool, a note-taker that can transform Markdown to Word format, and a talking clock that speaks the time every 30 minutes. Apps for the blind include a ton of games, and MUD client packages with ready-to-go sound packs and scripts. Braille translation tools, YouTube audio downloaders, and, in the past, even a full social network for the blind with a custom audio interface have also been created. Windows also allows for third-party screen readers, like NVDA and JAWS, alongside Microsoft’s own Narrator.

iOS, on the other hand, is closed to addons and system-level extensions. This means that Apple’s small accessibility team has to build all of VoiceOver’s features, and fix all the bugs. However, many apps exist to make things a lot better on iOS, from numerous games, to navigation apps, to text recognition and identification apps, and so on. Android has a good bit of these, but iOS has them all. Sure, Android has some smaller apps, but they just use Microsoft’s or Google’s vision frameworks to do what the apps on iOS already do. iOS has also been the first platform to do many things, like offer comprehensive Braille support, audible graphs, image recognition, accessibility actions, and screen recognition. And smaller things, like using recorded audio as a substitute pronunciation, allowing someone to just speak how they want a word pronounced. Android doesn’t even have a system-wide pronunciation dictionary. No, pronunciation dictionaries for voices like Eloquence and Vocalizer don’t count. Oh, speaking of Eloquence: Apple was the first mainstream company to give many blind people what they want, Eloquence on their computers and phones. Now, regardless of what architecture iOS or macOS moves to, they’ll always have Eloquence. Compare that to Android, where once phones went 64-bit, Eloquence was gone.
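
As an aside on those vision frameworks: this is roughly what leaning on Google’s looks like, as a minimal sketch using ML Kit’s on-device image labeling. The describe function and its callback are my own illustration, not code from any particular app:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Label an image on-device with ML Kit. A recognition app would hand the
// resulting labels to the screen reader, for example by setting them as
// a view's contentDescription.
fun describe(bitmap: Bitmap, onResult: (String) -> Unit) {
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    labeler.process(image)
        .addOnSuccessListener { labels -> onResult(labels.joinToString { it.text }) }
        .addOnFailureListener { onResult("No description available") }
}
```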

My thoughts

So, this is what’s wrong with Linux: it has very few blind people using it, and very few sighted people working on the accessibility frameworks. Windows just has such a history behind it that it cannot make itself more appealing to blind people. Apple, I think, needs many more accessibility staff, many more testers, especially in Braille, and management that will listen when they say a bug needs to hold back the release of an iOS version. Otherwise, bugs will keep piling up. Android is coming along, but it requires developers in the community behind it. App developers will need to learn about accessibility actions, Braille support, and so on. As long as iOS languishes like this, Android can and will catch up. I’m most excited about macOS, though. It has a solid base, great features already built into VoiceOver and the system, and iOS to lean on for things like a shared image description library. That talking clock addon for NVDA? macOS has that built in. Not even Windows has that. VoiceOver can also resize and move items, mostly useful in apps like GarageBand, but once tactile displays hit the market, who knows what will be possible with just that kind of forward-thinking support. Oh, and not to mention the Actions rotor on macOS, moving through linked items, heading navigation in documents, just little things like that that make macOS a joy to use.

I’d love to see that kind of attention to detail in other operating systems. Even more, I’d love to see that attention to good design, and a kind of understanding of how to make an interface where even a blind person feels like they’re using something beautiful. An interface that makes one want to keep looking at it because it’s so good, so helpful, and at the same time friendly and inviting. Not the cold, sterile, rundown feel of Windows, or the somewhat empty feeling of Android.

Published: February 3, 2023

Google: Full Speed Ahead

For years now, Google has been seen, for good reason I’d say, as moving very slowly on accessibility. TalkBack would get updates in fits and starts, but otherwise didn’t seem to have people who could devote much time to it. Starting a few years ago with multi-finger gestures, TalkBack development began picking up steam, and to my surprise, delight, and relief, it has not slowed down. They seem to spend as much time resolving issues as they spend creating new features and experiences. This was highlighted in the new TalkBack update that began rolling out on January 9.

On that day, there was a TalkBack update from Google (not Samsung) which bumped the version to TalkBack 13.1. New in this version is the ability to use your HID Braille display over USB; support for Bluetooth will come when Android has Bluetooth drivers for these displays. That alone is worth an update. But there’s more! New in TalkBack is the ability to spell check messages, notes, and documents. That took iOS two major updates to complete. But there’s more! Now, we can use actions the same way iOS does. That alone would have been worth several updates. We also have many more languages available for Braille users, and we can now switch the direction of the panning buttons. On the Focus Braille display, the right whiz-wheel-type buttons now pan, giving two ways to pan text. And we can now move from one container to another, just like on iOS.

Now, I know that was a lot of info, in just a minor version bump. So let’s unpack things a bit. I’ll describe the new features, and why they impress me a lot more than Apple’s latest offerings.

HID Braille over USB

When TalkBack’s Braille support was shown off last year, there was a lot of talk about the displays that were left out. Displays from Humanware, which use the Braille HID standard, were not on the list. That was mainly because there are no Android Bluetooth drivers for those displays, meaning TalkBack can’t do anything with them over Bluetooth. With this update, however, people who have these displays, like the NLS eReader from Humanware, can plug them into their phone with a USB-C cable and use them with TalkBack. This is made even simpler because Android phones already use USB-C, so you don’t need an adapter to plug your display into your phone.

This demonstrates two things to me. First, the TalkBack team is willing to do as much as they can to support these new displays and the new standard. I’m sure they’re doing all they can to work with the Bluetooth team to get a driver into Android 14 or 15. Second, even if the wider Android team doesn’t have something ready, the accessibility team will do whatever they can to get something working. Since Braille is important, they released USB support for these displays now, rather than waiting for Bluetooth support later. And when Bluetooth support does arrive, adding these displays should be easier and quicker.

Now, TalkBack’s Braille support isn’t perfect, as we’ll see soon, but when you’re walking down a path, steps are what matters. And walking forward slowly is so much better than running and falling several times and getting bugs and dirt all over you.

Spellchecking is finally here!

One day, I want to be able to use my phone as my only computing device. I would like to use it for playing games, writing blog posts like this one, web browsing, email, note-taking, everything at work, coding, learning to code, and Linux stuff. While iOS’ VoiceOver has better app support, from the likes of Ulysses and such, Android is building what could ultimately give many developers a reason to support accessibility. Another brick was just put into place: the ability to spell check.

This uses two new areas of TalkBack’s “reading controls”: a new control for finding spelling errors, and the new Actions control for correcting the misspelling. It works best if you start from the top of a file or text field. You switch the reading control to the “spell check” option, swipe up or down to find a misspelled word, then change the control to “actions” and choose a correction. iOS users may say, “Well, yeah, I can do that too.” But that’s the point. We can now even more clearly make the choice between iPhone and Android based not on “Can I get stuff done?” but on “How much do I want to do with my phone?” and “How much control do I want over the experience?” This is all about leveling the field between the two systems, and letting blind people decide what they like, more than what they need.

Actions become instant

From what I have seen, the iPhone has always had actions. VoiceOver users could always delete an email, dismiss notifications, and reschedule reminders with the Actions rotor, where a user swipes up or down with one finger to select an option, then double taps to activate it. This lets blind people perform swipe actions, like deleting a message, liking a post, boosting a toot, or going to a video’s channel. Android had them too; they were just buried in an Actions menu. Unless you assigned a command to it, you had to open the TalkBack menu, double tap on Actions, find the action you wanted, and then double tap. Here are the steps for a new Android user, who has not customized the commands, to dismiss a notification through the Actions menu:

  • Find the notification to be dismissed
  • Tap once with three fingers to open the TalkBack menu.
  • Double tap with one finger to open the Actions menu.
  • Swipe right with one finger to the “Dismiss” option.
  • Double tap with one finger.

Now, with the new Actions reading control, here’s how the same user will dismiss a notification:

  • Find the notification.
  • Swipe up with one finger to the “dismiss” option.
  • Double tap with one finger.

This is an action users perform many times a day, and this essential task has gone from five steps down to three. And, with TalkBack’s excellent focus management, once you dismiss a notification, TalkBack immediately begins speaking the next one. So to dismiss that one too, you just swipe up with one finger, then double tap again. It’s effortless, quick, and delightfully responsive.

On Android, since actions have been rather hidden from users, developers haven’t always put them into their apps. Of course, not every app needs them, but they would help apps like YouTube, YouTube Music, Facebook, GoodReads, PocketCasts, Google Messages, WhatsApp, Walmart, and Podcast Addict, to name a few. It will take some time for word of this new ability to spread around the Android developer space. For Android developers who may be reading this, please refer to this section on adding accessibility actions; that entire page is a great resource for creating accessible apps. It describes things clearly and gives examples of using those sections in code. A rough sketch of the idea follows.
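
To make that concrete, here is a minimal sketch using AndroidX’s ViewCompat.addAccessibilityAction. The makeDismissible wrapper and the dismissItem callback are hypothetical names of my own, and a real app would pull the label from a localized string resource:

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Expose a custom "Dismiss" action on a view. TalkBack surfaces custom
// actions in its Actions menu and, as of 13.1, in the Actions reading
// control, so a user can swipe to "Dismiss" and double tap instead of
// hunting for a hidden button.
fun makeDismissible(itemView: View, dismissItem: () -> Unit) {
    ViewCompat.addAccessibilityAction(itemView, "Dismiss") { _, _ ->
        dismissItem() // run the same code a swipe-to-dismiss gesture would
        true          // report that the action was handled
    }
}
```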

Interestingly, the other method of accessing actions is still around. If you have an app, like Tusky, which has many actions, and you want to access one at the end of the list, you can still open the Actions menu, find the action you want, and double tap. In Android, we have options.

New Languages and Braille features

One of the critical pieces of feedback from users of the Braille support was that only about four languages were supported. Now, besides a few like Japanese and Esperanto, many languages are supported. One can add new Braille languages or remove them, like Braille tables in iOS, except everyone knows what a language means in this instance, while very few know what a Braille table is. That speaks to the sometimes very technical language that blindness companies use in their products, from “radio button” to “verbosity”, which I should write about in the future. For now, though, Google named its stuff right, in my opinion.

In the advanced screen of Braille settings, you can now reverse the direction of panning buttons. I never liked this, but if someone else does, it’s there. You can also have Braille shown on the screen, for sighted users or developers.

For now, though, if you choose American English Braille instead of Unified English Braille, you can only use Grade 1 Braille, not Grade 2. However, computer Braille is now an option, so you can finally read NLS BARD Braille books, or code in Braille, on your phone. This brings textual reading a step closer on Android!

Faster and cleaner

Speed matters. Bug fixes matter. In TalkBack 13.1, Google gave us both. TalkBack, especially while writing with the Braille onscreen keyboard, is somehow even snappier than before. That bug where, if you paused speech, TalkBack from then on couldn’t read past one line of a multi-line item, is gone. And TalkBack now reads the time as the first thing it says when you wake up your phone, every time.

Meanwhile, if I have VoiceOver start reading a page down from the current position, it stops speaking for no reason. iOS feels old and sluggish, and I don’t feel like I can trust it to keep up with me. I just want Apple to focus on fixing its bugs rather than working on new features. They spent resources on technology like that DotPad they were so excited about, a device no blind people actually have, while their tried-and-true Braille display support suffers. Yeah, I’m still a bit mad about that.

The key takeaway from this section is that perhaps real innovation is being able to push out features without breaking as much as you add. For blind people, a screen reader isn’t just a cool feature, or a way to look kind in the media, or a way to help out a small business with cool new tech. It’s a tool that had better be ready to do its job. Blind people rely on this technology. It’s not a fun side project, not a brain experiment. It’s very practical work that requires caring for people who are, often, not like you.

Luckily, Google has blind people who work for them. And if the past year is any indication, they’re finally getting the resources, and the attention, they need to really address customer feedback and give blind Android users what will make Android a great system to use.

Categories: accessibility Android blindness Google

Published: January 13, 2023

Android versus iOS: stability versus Features

So, boiling my feelings on iOS versus Android down into a simpler post: as of now, iOS = features, Android = stability, plus seemingly better support for all the PWA apps that are zipped up and put on the app stores. Apple makes me feel like Squidward when I’m typing in Braille, pressing Space with dots 4-5 to translate, one… word… at… a… time! All that just to clear the translation queue or whatever when it gets stuck. And that’s where the crap comes in: on iOS, I have no idea what’s causing these anger-inducing issues. Oh, and the bug where, if you press Enter in, say, iMessage to send a message, a menu pops up instead, is still there. That bug was supposed to be fixed in the latest iOS 16.2 update. But nope. I guess VoiceOver has lived long enough to become a mess. A sluggish, frustrating mess that no amount of image and screen recognition can fix.

Meanwhile, on Android, while Braille doesn’t have nearly the same number of features, it at least doesn’t have the bugs that exist on iOS. The translation system is about as good as JAWS’. It doesn’t slow down, it doesn’t get stuck needing to be practically plunged like iOS, and the only issue is that when typing a colon and then a right parenthesis, it doesn’t make the smiley face, but produces something like “conar” or “con)” instead. So much for UEB making Braille easier for computers to process. I think that will get much better with a Liblouis update, though.

Now, for the part about web apps, or something similar to Electron. With the Evidation app on iOS, you get a lot of tasks, like how you’re feeling today, or health questions, out of order. So you hear one thing, then the second, then actions for the first, and so on. I don’t doubt that this is an accessibility issue on Evidation’s part. But if Android can get this right, even when using a Braille display, Apple can get it right as well. Besides, TalkBack is the open source app, right? Apple can even learn from it. Imagine that. I mean, I know most blind people don’t care too much about all this. Most people love their iPhones and Apple Watches and AirPods. And I respect that. They are, after all, great devices with much vendor lock-in. But as the bugs pile up, as garbage begins to stink, as dirty dishes draw flies, more and more people are going to Android. It’s already happening. Sure, it’s not a lot, but it’s growing, 1% a year I’d guess, at the least. And then they see that TalkBack has a tutorial for getting started, covering the Braille onscreen keyboard, using a Braille display, and icon descriptions, which, I might add, are quite a bit more helpful than VoiceOver’s, because VoiceOver is focused on the image, has so much data, and can’t really seem to zoom in and tell that it’s just an icon.

And Google isn’t slowing down either. I read a week or so ago that Google is opening an accessibility office in London, I think; somewhere in the UK, I know. I’m not sure if that’s just going to be a place where Trusted Testers can go and test things, or if there will be more to it, but that, to me, shows that they’re done napping like they were from Android 5 through 10. And I’m here for it. Yes, Google has a ton of catching up to do. But I think we’ll see them put their own spin on catching up, like describing icons first, or having tools that each do one thing well but link together, like Linux, rather than having VoiceOver do everything as Apple does. So, during this Christmas, I’ve gone to live with family for a while, leaving my iPhone. I don’t feel like I’ll need it for a while. And maybe, with SoundScape slated for decommissioning, I won’t have much more of a reason to go back to the iPhone. I just have to find good headphones and a good watch for Android and I’m good. It already works with my PC and Chromebook, much better than the iPhone works with the Mac, so I just have to get good accessories.

Published: December 17, 2022


So I was sitting at a restaurant, waiting on my food and typing on my NLS eReader Humanware Braille display connected to my iPhone SE 2020. As I typed, I noticed that words I’d just typed weren’t showing up. So I did the Space with dots 4-5 “translate” command to force the stupid piece of junk to work correctly. I had to do that all throughout typing today, and I just… I’m not as patient as I used to be. Android doesn’t have that problem. And it doesn’t have the problem where pressing Enter sometimes pops up a menu. Oh hey, wasn’t that supposed to be fixed in 16.2? Well, it happened today and I just quit. Ugh, I get so tired of these bugs. I know Android’s Braille support is new enough not to have had time to accumulate these kinds of bugs, but my goodness, these have been around long enough that you’d think they’d have been fixed by now. It just makes me not even want to use a phone.

Published: December 16, 2022


A new beginning

So, I’m writing this from a Windows computer, using Notepad, with WinSCP providing SFTP access to the server. This won’t come as a surprise for those who follow me on Mastodon and such, but I want to put this in the blog, so everything is complete.

About half a year ago, I installed Linux. Sometimes I get curious as to whether anything has changed in Linux, or if it’s any better than it once was. And I want to know if I can tackle it, or if it’s even worth it. So, half a year ago, I installed Arch using the Anarchy installer, got the accessibility switches turned on, and got to work trying to use it.

Throughout my journey with Linux, I found myself having to forego things that Windows users take for granted. Stuff like instant access to all the audio games made for computers, and regular video games which, even when accessible, use only Windows screen readers for speech. And all the tools that make life a little easier for blind people: built-in OCR for all screen readers on the platform, different choices in email clients and web browsers, and even things like RSS and podcatcher clients made by blind people themselves, not to mention Twitter clients. Now, there is OCR Desktop on Linux, but it doesn’t come with Orca, and you must set up a keyboard command for it.

But I had Emacs, GPodder for podcasts, Firefox, Chromium when I wanted to deal with that, and Thunderbird for lagging my system every time it checked for email. It was usable, and a few blind people do make use of it as their daily driver. But I just couldn’t. I need something that’s easy to set up and use; otherwise my stress levels just keep going up as I fight not only with config files and all that, but with accessibility issues as well.

The breaking point

A few days ago, I wanted to get my Android phone talking to my Linux computer, so that I could text, get notifications, and make calls. KDE Connect wasn’t accessible, so I tried Device Connect. I couldn’t get anything out of that, so I tried GSConnect. In order to use that Gnome extension, I needed to start Gnome. I have Gnome 40, since I’m on Arch, so I logged in using that session and got started. Except Gnome had become much less accessible since the last time I’d tried it. The Dash was barely usable, the top panels trapped me until I opened a dialog from them, and I was soon just too frustrated to go much further. And then I finally opened the Gnome Extensions app, only to find that it’s not accessible at all.

There’s only so much I can take before I just give up and go back to Windows, and that was it. It doesn’t matter how powerful a thing is if one cannot use it, and while Linux is good for simple, everyday tasks, when you really start digging in, when you really start trying to make Linux your ecosystem, you find barriers all over the place.

Now I’m using Windows, with Steam installed along with a few accessible video games, Google Chrome, and NVDA with plenty of add-ons, and the “Your Phone” app on Windows and Android works great, except for calls. It still works much better than any Linux integration I could manage. Also, with Windows and Android, I can open the Android phone’s screen in Windows and, with NVDA or another screen reader, control the phone from the keyboard using TalkBack keyboard commands. That’s definitely not something Linux developers would have thought of.

Published: December 16, 2022


Writing Richly

Whenever you read a text message, forum post, Tweet, or Facebook status, have you ever seen someone surround a word with stars, like *this*? Have you noticed someone surround a phrase with two stars? This is a part of Markdown, a form of formatting text for web usage.

I believe, however, that Markdown deserves more than just web usage. I can write in Markdown in this blog, I can use it on Github, and even in a few social networks. But wouldn’t it be even more useful everywhere? If we could write in Markdown throughout the whole operating system, couldn’t we be more expressive? And for accessibility issues, Markdown is great because a blind person can just write to format, instead of having to deal with clunky, slow graphical interfaces.

So, in this article, I will discuss the importance of rich text, how Markdown could empower people with disabilities, and how it could work system-wide throughout all computers, even the ones in our pockets.

What’s this rich text and who needs all that?

Have you ever written in Notepad? It’s pretty plain, isn’t it? That is plain text. No bold, no italics, no underline, nothing. Just, if you like that, plain, simple text. If you don’t like plain text, you find yourself wanting more power, more ability to link things together, more ways to describe your text and make the medium, in some ways, a way to get the message across.

Because of this need, rich text was created. One can use it in WordPad, Microsoft Word, Google Docs, LibreOffice, or any other word processor worth something. When I speak of rich text, to keep things simple, I mean anything that is not plain text, including HTML, as it describes rich text. Rich text is in a lot of places now, yes, but it is not everywhere, and it is not the same in all the places it is in.

So, who needs all that? Why not just stick with plain text? I mean come on man, you’re blind! You can’t see the rich text. In a way, this is true. I cannot see the richness of text, but in a moment, we’ll get to how that can be done. But for sighted people, which text message is better?

Okay, but how’s your day going?

Okay, but how’s *your* day going?

For blind people, the second message has the word “your” italicized. Sure, we may have gotten used to stars surrounding words meaning something, but that is a workaround, and not nearly the optimal outcome of rich text.

So what can you do with Markdown? Plenty. You could use a single blank line between blocks of text to show paragraphs in your journal. You could create headings for chapters in your book. You could make links to websites in your email. You could even simply italicize an emphasized word in a text. Markdown can be as little or as much as you need it to be. And if you don’t add any stars, hashes, dashes, brackets, or HTML markup, it’s just what it is: plain text.
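For instance, a short note using a bit of everything above might look like this (a made-up sample, not from any real document):

```markdown
# My Journal

Today was a *good* day, and I mean **really** good.

This second paragraph only needed one blank line above it, and
[this link](https://example.com) only needed brackets and parentheses.
```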

Also, it doesn’t have to be hard. Even Emacs, an advanced text editor, gives you questions when you add a link, like “Link text,” “Link address,” and so on. Questions like that can be asked of you, and you simply fill in the information, and the Markdown is created for you.

Okay but what about us blind people?

To put it simply, Markdown shows us rich text. In the next section, I’ll talk about how, but for now, let’s focus on why. With nearly all screen readers, text formatting is not shown to us. Only Narrator on Windows 10 shows formatting with minimal configuration, and JAWS can be used to show formatting using a lot of configuration of speech and sound schemes.

But, do we want that kind of information? I think so. Why wouldn’t we want to know exactly what a sighted person sees, in a way that we can easily, and quickly, understand? Why would we not want to know what an author intended us to know in a book? We accept formatting symbols in Braille, and even expect it. So, why not in digital form?

NVDA on Windows can be set to speak formatting information as we read, but it can be bold on quite arduous to hear italics on all this italics off as we read what we write bold off. Orca can speak formatting like NVDA, as well. VoiceOver on the Mac can be set to speak formatting, like NVDA, and also has the ability to make a small sound when it encounters formatting. This is better, but how would one distinguish bold, italics, or underline from a simple color change?

Even VoiceOver on iOS, which arguably gets much more attention than its Mac sibling, cannot read formatting information. The closest we get is a phrase separated from the rest of the paragraph into its own item, showing that it’s different, in Safari and other web apps. But how is it different? What formatting was applied to this “different” text? Otherwise, text is plain, so blind people don’t even know that there is a possibility of formatting, let alone that the formatting isn’t made known to us by the program tasked with giving us this information. In some apps, like Notes, one can get some formatting information by reading line by line in the note’s text field, but what if one simply wants to read the whole thing?

Okay but what about writing rich text? I mean, you just hit a hotkey and it works, so what could be better than that? First, when you press Control + I to italicize, there is no guarantee that “italics on” will be spoken. In fact, that is the case in LibreOffice for Windows: you do not know if the toggle key toggled the formatting on or off. You could write some text, select it, then format it, but again, you don’t know if you just italicized that text, or removed the italics. You may be able to check formatting with your screen reader’s command, but that’s slow, and you would hate to do that all throughout the document. Furthermore, dealing with spoken formatting as it is, it takes some time to read your formatted text. Hearing descriptions of formatting changes tires the mind, as it must interpret the fast-paced speech, get a sense of formatting flipped from off to on, and quickly return to interpreting text instead of text formatting instruction. Also, because all text formatting changes are spoken like the text surrounding it, you may have to slow down your speech just to get somewhat ahead of things enough to not grow tired from the relentless text streaming through your mind. This could be the case with star star bold or italics star star, and if screen readers would use more fine control of the pauses of a speech synthesizer, a lot of the exhausting sifting through of information which is rapidly fired at us would be lessened, but I don’t see much of that happening any time soon.
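To make that wish concrete, here’s a sketch of how a screen reader might mark an italicized run with short pauses and a pitch change instead of spoken phrases. It builds SSML, the W3C speech markup standard; whether a given synthesizer honors these tags, and the exact pause and pitch values, are assumptions:

```python
# Wrap formatted runs of a paragraph in SSML breaks and prosody changes.
def ssml_for_runs(runs):
    """runs: list of (text, is_italic) pairs making up one paragraph."""
    parts = ["<speak>"]
    for text, is_italic in runs:
        if is_italic:
            # A brief pause and raised pitch "show" the italics by sound.
            parts.append('<break time="150ms"/><prosody pitch="+15%">')
            parts.append(text)
            parts.append('</prosody><break time="150ms"/>')
        else:
            parts.append(text)
    parts.append("</speak>")
    return "".join(parts)

print(ssml_for_runs([("Okay, but how's ", False),
                     ("your", True),
                     (" day going?", False)]))
```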

Even on iOS, where things are simpler, one must deal with the same problems as on other systems, except knowing if formatting is turned on or off before writing. There is also the problem of using the touch screen, using menus just to select to format a heading. This can be worked around using a Bluetooth keyboard, if the program you’re working in even has a keyboard command to make a heading, but not everyone has, or wants, one of those.

Markdown fixes most of this, at least. We can write in Markdown, controlling our formatting exactly, and read in Markdown, getting much more information than we ever have before, while also getting less excessive textual information; hearing “star” instead of “italics on” and “italics off” does make a difference. “Star” is not usually read surrounding words, and has already become, in a sense, a formatting term. “Italics on” sounds like plain text, is not a symbol, and, while it is a formatting term, has many syllables and just takes time to say. Coupled with the helpfulness of Markdown for people without disabilities, adding it across an entire operating system would be useful for everyone; not just the few people with disabilities, and not just the majority without.

So, how could this work?

Operating systems, the programs which sit between you and the programs you run, have many layers and parts working together to make the experience as smooth as the programmers know how. In order for Markdown to be understood, there must be a part of the operating system that translates it into something the component that displays text understands. Furthermore, this component must be able to display the resulting rich text, or Markdown interpretation, throughout the whole system: not just in Google Docs, not just in Pages, not just in Word, but in Notepad, in Messages, in Notes, in a search box.
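As a minimal sketch of that translation layer, assuming the Python markdown package stands in for the hypothetical OS component that turns Markdown into something the text renderer understands:

```python
# Translate Markdown source into HTML for a text widget to render.
import markdown  # pip install markdown

def render_rich_text(source: str) -> str:
    """One system-wide function any text field could call."""
    return markdown.markdown(source)

print(render_rich_text("Okay, but how's *your* day going?"))
# <p>Okay, but how's <em>your</em> day going?</p>
```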

With that implemented, though, how should it be used? I think there should be options. It’s about time some companies released their customers from the “one size fits all” mentality anyway. There should be an option to replace Markdown formatting with rich text unless the line the formatting is on has input focus, a mode for showing the Markdown only and no rich text, and an option for showing both. I’ll sketch these three modes in code after describing them below.

For sighted people, I imagine seeing Markdown would be distracting. They want to see a heading, not the hash mark that makes the line a heading. So, hide Markdown unless that heading line is navigated to.

For blind people, or for people who find plain text easier to work with, and for whom the display of text in different sizes and font faces is jarring or distracting, having Markdown only would be great, while being translated for others to see as rich text. Blind people could write in Markdown, and others can see it as rich text, while the blind person sees simply what they wrote, in Markdown.

For some people, being able to see both would be great. Being able to see the Markdown they write, along with the text that it produces, could be a great way for users to become more comfortable with Markdown. It could be used for beginners to rich text editing, as well.
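Here’s that small sketch of the three modes; the names and the function are hypothetical, not any real platform’s API:

```python
# Choose what a line of text shows under each proposed display mode.
from enum import Enum, auto

class MarkupDisplay(Enum):
    RICH_ONLY = auto()      # hide the Markdown unless its line has input focus
    MARKDOWN_ONLY = auto()  # show raw Markdown, never the rendered text
    BOTH = auto()           # show the Markdown alongside the rendered text

def line_to_show(source: str, rendered: str,
                 mode: MarkupDisplay, focused: bool) -> str:
    if mode is MarkupDisplay.MARKDOWN_ONLY:
        return source
    if mode is MarkupDisplay.RICH_ONLY:
        return source if focused else rendered
    return f"{source}\n{rendered}"  # BOTH

# A heading line while the cursor is elsewhere:
print(line_to_show("# Chapter 1", "Chapter 1",
                   MarkupDisplay.RICH_ONLY, focused=False))
```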

But, which version of Markdown should be used?

As with every open source, or heatedly debated, thing in this world, there are many ways of doing things. Markdown is no different. There is:

- CommonMark
- GitHub Flavored Markdown
- Pandoc’s Markdown
- MultiMarkdown
- Apple’s Swift Markdown

and probably many others. I think that Pandoc’s Markdown would be the best, most extended variant to use, but I know that most operating system developers will stick with their own. Apple will stick with Swift Markdown, Microsoft may stick with GitHub Markdown, and the Linux developers may use Pandoc, if Pandoc is available as a package on the user’s architecture, and if not, then it’s someone else’s issue.

Conclusion

In this article, I have attempted to communicate the importance of rich text, why Markdown would make editing rich text easy for everyone, including people with disabilities, and how it could be implemented. So now, what do you all think? Would Markdown be helpful for you? Would writing blog posts, term papers, journal entries, text messages, notes, or Facebook posts be enhanced by Markdown rich text? For blind people, would reading books, articles, or other text, and hearing the Markdown for bold, italics, and other such formatting make the text stand out more, make it more beautiful to you, or just get in your way? For developers, what would it take to add Markdown support to an operating system, or even your writing app? How hard will it be?

Please, let me know your thoughts, using the Respond popup, or replying to the posts on social media made about this article. And, as always, thank you so much for reading this post.

Published: December 16, 2022


Apple’s accessibility consistency

This article will explore Apple’s consistent attention to accessibility, and how other tech companies with commitments to accessibility, like Microsoft and Google, compare to Apple in their accessibility efforts. It also shows where these companies can improve their consistency, and that no company is perfect at being an Assistive Technology provider yet.

Introduction

Apple has shown a commitment to accessibility since the early days of the iPhone, and since Mac OS X Tiger. Its VoiceOver screen reader was the first built-in screen reader of any real usability on a personal computer and smartphone. Now, VoiceOver is on every Apple product, even the HomePod. It is so prevalent that people I know have begun calling any screen reader “VoiceOver.” This level of consistency should be congratulated in a company of Apple’s size and wealth. But is this a continual trend, and what does it mean for competitors?

This will be an opinion piece. I will not stick only to the facts as we have them, and won’t give sources for everything I present as fact. This article is a testament to how accessibility can be made a fundamental part of a brand’s experience for affected people, so feelings and opinions will be involved.

The trend of accessibility

The following sections of the article explore companies’ accessibility trends so far. The focus is on Apple, but I’ll also show some of what its competitors have done over the years. As Apple has a greater following among blind people, and AppleVis has documented so much of Apple’s progress, I can show more of Apple’s history than its competitors’, whose information, written by their followers, is scattered and thus harder to search for.

Apple

Apple has a history of accessibility, shown by this article. Written just under a decade ago, it goes over the previous decade’s advancements. As that article has done, I will focus little on a company’s talk about accessibility, and more on its software releases and services.

Apple is, by numbers and satisfaction, the leader in accessibility for users of its mobile operating systems, but not in general purpose computer operating systems; Microsoft’s Windows is used far more than Apple’s macOS. Beyond that, and services, Apple has made its VoiceOver screen reader on iOS much more powerful, and even more flexible, than its competitor, Google’s TalkBack.

iOS

As iPhones were released each year, so were newer versions of iOS. In iOS 6 accessibility settings began working together, VoiceOver’s Rotor gained a few new abilities, new braille displays worked with VoiceOver, and bugs were fixed. In iOS 7, we gained the ability to have more than one high quality voice, more Rotor options, and the ability to write text using handwriting.

Next, iOS 8 was pretty special to me, personally, as it introduced the method of writing text that I almost always use now: Braille Screen Input. This lets me type on the screen of my phone in braille, making my typing exponentially faster. Along with typing, I can delete text, by word or character, and now, send messages from within the input mode. I can also change braille contraction levels, and lock orientation into one of two typing modes. Along with this, Apple added the Alex voice, its most natural yet, which was before only available on the Mac. For those who do not know braille or handwriting, a new “direct touch typing” method allows a user to type as quickly as a sighted person, if they can memorize exactly where the keys are, or have spell check and autocorrection enabled.

In iOS 9, VoiceOver users became able to choose Siri voices for VoiceOver speech, as an extension of the list of Vocalizer voices and Apple’s Alex voice. One can now control speech rate more easily, and the speed of speech can be greater than previously possible. This release also brought control over how long a double tap should take, a better method of selecting text, Braille Screen Input improvements, and braille display fixes and new commands.

Then, iOS 10 arrived, with a new way to organize apps, a pronunciation dictionary, even more voices, reorganized settings, new sounds for actions, a way to navigate threaded email, and some braille improvements. One great thing about the pronunciation editor is that it applies not just to the screen reader, as in many Windows screen readers, but to the entire system’s speech. So, if you use VoiceOver, but also Speak Screen, both will speak as you have set them to. This is a testament to Apple’s attention to detail, and control of the entire system.

With the release of iOS 11, we gained the ability to type to Siri, new Siri voices, verbosity settings, the ability to have subtitles read or brailled, and the ability to change the speaking pitch of the voice used by VoiceOver. VoiceOver can now describe some images, which will be greatly expanded later. We can now find misspelled words, which will also be expanded later. One can now add and change commands used by braille displays, which, yes, will be expanded upon later. A few things which haven’t been expanded upon yet are the ability to read formatting, however imprecise, with braille “status cells,” and the “reading” of Emoji. Word wrap and a few other braille features were also added.

Last year, in iOS 12, Apple added commands to jump to formatted text for braille display users, new Siri voices, verbosity options, confirmation of rotor actions and sent messages, expansion of the “misspelled” rotor option for correcting the misspelled word, and the ability to send VoiceOver to an HDMI output.

Finally, in iOS 13, Apple moved accessibility to the main settings list, out of the General section, provided even more natural Siri voices, and added haptics for VoiceOver, to aid alongside, or replace, the sounds already present, with the ability to modify or turn them off. A “vertical scroll bar” has also been added, as another method of scrolling content. VoiceOver can now give even greater suggestions for taking pictures, aligning the camera, and, with the iPhone 11, what will be in the picture. One can also customize commands for the touch screen, braille display, and keyboard, expanding the ability braille users already had. One can even assign Siri Shortcuts to a VoiceOver command, as Mac users have been able to do with AppleScript. One can now have VoiceOver interpret charts and graphs, either via explanations of data, or by an audible representation of them. This may prove extremely useful in education, and for visualizing data of any type. Speaking of detected text: that feature has improved over the versions to include detecting text in unlabeled controls, and can now attempt to describe images as well. Braille users now have access to many new braille tables, like Esperanto and several other languages, although braille no longer switches languages along with speech.

macOS

MacOS has not seen so much improvement in accessibility over the years. VoiceOver isn’t a bad screen reader, though. It can be controlled using a trackpad, which no other desktop screen reader can boast. It can be used to navigate and activate items with only the four arrow keys. It uses the considerable amount of voices available on the Mac and for download. It simply isn’t updated nearly as often as VoiceOver for iOS.

OS X 10.7, 10.8, and 10.9 saw a few new features, like more VoiceOver voices, braille improvements, and other things. I couldn’t find much before Sierra, though, so we’ll start there.

In Sierra, Apple added VoiceOver commands for controlling volume, to offset the absence of the physical function keys in new MacBook models. VoiceOver can also now play a sound for row changes in apps like Mail, instead of interrupting itself to announce “one row added,” because Apple’s speech synthesis server on the Mac doesn’t innately support a speech queue. This means that neither does VoiceOver, so interruptions must be worked around. Some announcements were changed, HTML content became web areas, and interaction became “in” and “out of” items. There were also bug fixes in this release.

In High Sierra, one can now type to Siri, and VoiceOver can switch languages when reading multilingual text, as VoiceOver on the iPhone has been able to do since at least iOS 5. This release also brought improved braille editing and PDF reading support, image descriptions, and improved HTML5 support.

In macOS Mojave, Apple added the beginnings of the new iPad apps on the Mac. These apps work poorly with VoiceOver, even still in Catalina. There were no new reported VoiceOver features in this release.

This year, in macOS Catalina, Apple added more control of punctuation, and Xcode 11’s text editor is now a little more accessible, even though the Playgrounds function isn’t. The Books app can now, after years of being on the Mac, be used for basic reading of books. Braille tables from iOS 13 are also available in macOS.

The future of Apple accessibility

All of these changes, however, were discovered by users. Apple doesn’t really talk about all of its accessibility improvements, just some of the highlights. While I see great potential in accessible diagrams and graphs, Apple didn’t mention this; users had to find it. Subsequently, there may be fixes and features that we still haven’t found, three versions of iOS 13 later. Feedback between Apple and its customers has never been great, and this is only to Apple’s detriment. Since Apple rarely responds to feedback, users feel that their feedback doesn’t mean anything, so they stop sending it. Also of note is that on VoiceOver’s Mac accessibility page, the “Improved PDF, web, and messages navigation” section is from macOS 10.13, two versions behind what is currently new in VoiceOver.

Another point is that services haven’t been the most accessible. Chief among them is Apple Arcade, which has no accessible games so far. Apple Research, I’ve found, has some questions whose answers are simply unlabeled buttons. While Apple TV Plus has audio description for all of its shows, this is a minor glimmer of light, shrouded by the inaccessibility of Apple Arcade, which now features over one hundred games, none of which I can play with any success. In all fairness, a blind person who is patient may be able to play a game like Dear Reader, which has some accessible items, but the main goal of that game is to find a word in a different color and correct it, which is completely at odds with total blindness. It could, however, be handled using speech parameter changes, audio cues, or other signals of font, color, or style changes.

Time will tell if this new direction, taking no responsibility for other developers’ work, even work flaunted by Apple, nor for the Mac, will become the norm. After all, Apple Arcade is an entire tab of the App Store; its inaccessibility is in plain view. As a counterpoint, the first iPhone software, and even the second version, was inaccessible to blind people, but now the iPhone is the most popular smartphone, in developed nations, among blind people.

Perhaps next year, Apple Arcade will have an accessible game or two. I can only hope that this outcome comes true, and not the steady stepping back of Apple from one of its founding blocks: accessibility. We cannot know, as no one at Apple tells us their plans. We aren’t the only ones in the dark, though, as mainstream technology media shows. We must grow accustomed to waiting on Apple to show new things, and reacting accordingly, but also providing feedback, and pushing back against encroaching inaccessibility and the decay of macOS.

Apple’s competitors

In this blog post, I compare operating systems. To me, an operating system is the root of all software, and thus the root of all digital accessibility. With this in mind, the reader may see why it is imperative that an operating system be as accessible, as easy and delightful to use, and as conducive to productivity as possible. Microsoft and Google are the largest competitors of Apple in the closed source operating system space, so they are what I will compare Apple to in the following sections.

Google

Google is the main contributor to the Android and Chromium projects. While both are open source, both are simply a base to be worked from, not the end result. Not even Google’s phones run “pure” Android; they have Google services and probably other things on them as well. Both have varying accessibility, too. While Apple pays great attention to its mobile operating system’s accessibility, Google does not seem to put many resources toward that. However, its Chrome OS, which is used widely in education, is much more easily accessible, and even somewhat of an enjoyable experience for a lite operating system.

Android

Android was released one year after iOS. TalkBack was released as part of Android 1.6. Back then, it only supported navigation via a keyboard, trackpad, or scroll ball. It wasn’t until version 4 that touch screen access was implemented in TalkBack for phones, and to this day, it only supports commands done with one finger, two-finger gestures being passed through to Android as one-finger commands. TalkBack has worked around this issue recently: in Android 8, it gained the ability to use the fingerprint sensor, if available, as a gesture pad for setting options, and the ability to switch spoken language, if using Google TTS, when reading text in more than one language. TalkBack otherwise uses graphical menus for setting options or performing actions, like deleting email. It can be used with a Bluetooth keyboard. By default, it uses Google TTS, a lower quality, offline version of the speech used for things like Google Translate, Google Maps, and the Google Home. TalkBack cannot use the higher quality Google TTS voices. Instead, voices from other vendors are downloaded for more natural sound.

BrailleBack, discussed on its Google Support page, is an accessibility service which, when used with TalkBack running, provides rudimentary braille support to Android. Commands are rugged, meaningless, and unfamiliar to users of other screen readers, and TalkBack’s speech cannot be turned off while using BrailleBack, meaning that, as one person helpfully pointed out, one must plug in a pair of headphones and not wear them, or turn down the phone’s volume, to use one’s phone silently with braille. Silent reading is one of braille’s main selling points, but accessibility, if not given the resources necessary, can become a host of workarounds. Furthermore, BrailleBack must be installed onto the phone, providing another barrier to entry for many deaf-blind users, so some simply buy iPods for braille if they wish to use an Android phone for customization or contrarian reasons, or simply stick with the iPhone as most blind people do.

Now, though, many have moved to a new screen reader created by a Chinese developer, called Commentary. This screen reader does, however, have the ability to decrypt your phone if you have encryption enabled. For braille, BRLTTY is used. This level of customization, offset by the level of access which apps have to do anything they wish to your phone, is an edge that some enjoy living on, and it does allow things like third-party, and perhaps better, screen readers, text to speech engines, apps for blind people like The vOICe, which gives blind people artificial vision, and other gray area apps like emulators, which iOS will not accept on the App Store. Users who are technically inclined do tend to thrive on Android, finding workarounds a joy to discover and use, whereas people who are not, or who simply do not want to fiddle with replacements for first-party apps that do not meet their needs, or with unoptimized settings, find themselves configuring the phone more than using it.

Third party offerings, like launchers, mail apps, web browsers, and file managers, all have variable accessibility, which can change from version to version. Therefore, one must navigate a shifting landscape of first party tools which may sort of be good enough, third party tools which are accessible enough but may not do everything you need, and tools users have found workarounds for. Third party speech synthesizers are also hit or miss, with some not working at all, others, like Eloquence, now being unsupported, and more, like eSpeak, sounding unnatural. The only good braille keyboard which is free hasn’t been updated in years, and Google has not made one of its own.

Because of all this, it is safe to say that Android can be a powerful tool, but it has not attained the focus needed to become a great accessibility tool as well. Google has begun locking down its operating system, taking away some things that apps could do before. This may come to inhibit the third party tools which blind people now use to give Android better accessibility. I feel that it is better to be on iOS, where things are locked down more, but where you have, at least somewhat, a clear expectation of fairness on Apple’s part. Android is not a big income source for Google, so Google does not have to answer to app developers.

Chrome OS

Chrome OS is Google’s desktop operating system, running Chrome as the browser, with support for running Android apps. Its accessibility has improved plenty over the years, with ChromeVox gaining many features which make it a good screen reader. You can read more about ChromeVox on its Google Support page. One of the main successes of ChromeVox is its braille support. It is normal for most first-party screen readers to support braille nowadays. When one plugs a braille display into a Chromebook with ChromeVox enabled, ChromeVox begins using that display automatically, if it is supported. The surprise here is that if one plugs it in when ChromeVox is off, ChromeVox will automatically turn on and begin using the display. This is beyond what other screen readers can do. ChromeVox, and indeed TalkBack, do not yet support scripting, editing punctuation and pronunciation, or “activities” as VoiceOver for iOS and Mac has, but ChromeVox feels much more polished and ready for use than TalkBack.

The future of Google accessibility

Judging by the past, Google may add a few more features to TalkBack, but fewer than Apple adds to iOS. They have much to catch up on, however, as it was only two years ago that TalkBack gained the ability to detect and switch languages, and to use the fingerprint sensor like VoiceOver’s rotor. I have not seen much change over the two years since, except the turning of focus tracking from a toggle into a mandatory feature. I suspect that, in time, they will remove the option to disable explore by touch, if they’ve not already.

With Chrome OS, and Google Chrome in general, I hope that the future brings better things, now that Microsoft is involved in Chromium development. It could become even more tied to web standards. Perhaps ChromeVox will gain better sounding offline voices than Android’s lower quality Google TTS ones, or gain sounds rendered in spatial audio for deeper immersion.

Microsoft

Microsoft makes only one overarching operating system, with changes for the Xbox, HoloLens, personal computers, and other types of hardware. Windows has always been the dominant operating system for general purpose computing among blind people. It hasn’t always been accessible, though, and it is only in recent years that Microsoft has actively turned its attention to accessibility on Windows and Xbox.

Now, Windows’ accessibility increases with each update, and Narrator becomes a more useful screen reader. I feel that, in a year or so, blind people may be trained to use Narrator instead of other screen readers on Windows.

Windows

In the early days of Windows, there were many different screen readers competing for dominance. JAWS, Job Access With Speech, was the most dominant, with Window-Eyes, now abandoned, in second place. They gathered information from the graphics card to describe what was on the screen; there were no accessibility interfaces back then.

Years later, when MSAA, Microsoft Active Accessibility, was created, Window-Eyes decided to lean on that, while JAWS continued to use video intercept technology to gather information. In Windows 2000, Microsoft shipped a basic screen reader, Narrator. It wasn’t meant to be a full, useful screen reader, but one made so that a user could set up a more powerful one.

Now we have UI Automation, which is still not a very mature product, as screen readers are still not using it for everything, like Microsoft Office. GW Micro, makers of Window-Eyes, merged with AI Squared, producers of the ZoomText magnifier, which was bought by Freedom Scientific, who promptly abandoned Window-Eyes. These days, JAWS is being taken on by NVDA, NonVisual Desktop Access, a free and open source screen reader, and by Microsoft’s own Narrator.

In Windows 8, Microsoft began adding features to Narrator. Now, in Windows 10, four years later, Narrator has proven itself useful, and in some situations, helpful in ways that all other screen readers have not been. For example, one can install, set up, and begin using Windows 10 with Narrator. Narrator is the only self-described screen reader which can, with little configuration, show formatting not by describing it, but by changing its speech parameters to “show” formatting by sound. The only other access technology which does this automatically is Emacspeak, the “complete audio desktop.” Narrator’s braille support must be downloaded and installed, for now, but is still better than Android’s support. Narrator cannot, however, use a laptop’s trackpad for navigation. Instead, Microsoft decided to add such spatial navigation to touchscreens, meaning that a user must reach up and feel around a large screen, instead of using the flat trackpad as a smaller, more manageable area.

Speaking of support, Microsoft’s support system is better in a few ways. First, unlike Apple’s, their feedback system allows more communication between the community and Microsoft developers. Users can comment on issues, and developers can ask questions, a bit like on GitHub. Windows Insider builds come with announcements from Microsoft of what is new, changed, fixed, and broken. If anything changes regarding accessibility, it is in the release notes. Microsoft is vocal about what is new in the accessibility of Windows, in an era when many other companies seem almost ashamed to mention it in release notes. This is much better than Apple’s silence on many builds of their beta software, with no notice of accessibility improvements and features at all. Microsoft’s transparency is a breath of fresh air to me, and I am much more confident in their commitment to accessibility for it.

Their commitment, however, doesn’t seem to pervade the whole company. The Microsoft Rewards program is hard for me to use, and contains quizzes where answers must be dragged and dropped. This may be fun for sighted users, but I cannot do them with any level of success, so they aren’t fun for me at all. Another problem is the quality of speech. While Apple has superb speech options like MacinTalk Alex, Vocalizer, or the Siri voices, Microsoft’s offline voices sound bored, pause for too long, and have a robotic buzzing sound as they speak. I think that a company of Microsoft’s size could invest in better speech technology, or make their online voices available for download for offline use. Feedback has been given about this issue, so perhaps the next version of Windows will have more pleasant speech.

Windows has a few downsides, though. It doesn’t support sound through its Linux subsystem, meaning I cannot use Emacs with Emacspeak there. Narrator does not yet report when a program opens, when a new window appears, or other visual system events. Many newer Universal Windows apps can be tricky to navigate, and the Mail app still automatically expands threads as I arrow to them, which I do not want to happen, making the Mail app annoying to use.

The future of Microsoft accessibility

I think that the future of Microsoft, regarding accessibility, is very bright. They seem dedicated to the cause, seeking feedback much more aggressively than Apple or Google, and many in the blind community love giving it to them. Windows will improve further, possibly with Narrator gaining the ability to play interface sounds in immersive audio using Windows Sonic for Headphones, braille becoming a deeper, built-in part of Narrator, and higher quality speech made available for download. Since Microsoft is also a gaming company, it could work on creating soundscapes for different activities: browsing the web, writing text, coding, reading, to aid in focus or creativity. Speech synthesis could be given even more parameters for speaking even more types of formatting or interface item types. Really, with Microsoft’s attention to feedback, I feel that their potential for accessibility is considerable. Then again, it is equally possible that Apple will implement these features, but Apple isn’t as inviting as Microsoft has been when it comes to sharing what I’d love in an operating system, so I now just report bugs, not giving Apple new ideas.

Conclusion

It may be interesting to note the symmetry of accessibility: Apple’s phone is the dominant phone among blind people, but Microsoft’s Windows is the dominant laptop and desktop system. Apple’s iPhone is more accessible than Google’s Android, but Google’s Chrome OS is more polished and updated, accessibility-wise, than Apple’s macOS. Personally, I use a Mac because of its integration with iOS Notes, Messages, Mail, and other services; because the Mail app is a joy to breeze through email with; and because open source tools like Emacs with Emacspeak do not work as well on Windows. Also, speech matters to me, and I’d probably fall asleep much more often hearing Microsoft’s buzzing voices than the somewhat energetic sound of Alex on the Mac, who speaks professionally, calmly, and never gets bored. I do, however, use Windows for heavy usage of the web, especially Google web apps and services, and for gaming.

Time will tell if companies continue in their paths, Apple forging ahead, Microsoft burning bright, and Google… being Google. I hope, nevertheless, that this article has been useful for the reader, and that my opinions have been as fair as possible towards the companies. It should be noted that the accessibility teams for each company are individuals, have their own ideas of what accessibility is, means, and should be, and should be treated with care. After all, this past decade has been a long journey of, probably, most effort spent convincing managers that the features we now have are worth spending time on, and answering user complaints of “my phone is talking to me and i want it turned off right now!”.

This does not excuse them for the decay of Android and Mac accessibility, or the lack of great speech options on Windows. It does not excuse them for Apple Arcade’s lack of accessible games, or Microsoft Rewards’ inaccessible quizzes. We must give honest, complete, and critical feedback to these people. After all, they do not know what we need, what will be useful, or, if we dare tell, what will be delightful for us to use, unless we give them this feedback. This applies to all software, whether it be Apple’s silent gathering of feedback, Microsoft’s open arms and inviting offers, or open source software’s issue trackers, Discord servers, mailing lists, and GitHub repositories. If we want improvement, we must ask for it. If we want a better future, we must make ourselves heard in the present. Let us all remember the past, so that we can influence the future.

Now, what do you think of all this? Do you believe Apple will continue to march ahead regarding accessibility, or do you think that Microsoft, or even Google, has something bigger planned? Do you think that Apple is justified in its silence, or do you hope that it begins speaking more openly about its progress, at least in release notes? Do you like how open Microsoft is about accessibility, or do you think they still don’t talk enough about accessibility for blind users? I’d love to hear your comments, corrections, and constructive criticism, either in the comments, on Twitter, or anywhere else you can find me. Thanks so much for reading!

Advocacy of open source software

In this post, I’ll detail my experiences of advocating for accessibility in open source software, why it is important, and how others can help. I’ve not been doing it for long, but at least by now I’ve done a bit. I’ll also touch upon why I think open source software, on all operating systems, is important, and what open source grants that closed source and closed feedback systems cannot offer. On the other hand, there are things which closed source somewhat grants, but which have faltered slightly in recent days. I will attempt to denote what is fact and what is opinion; this goes for any post of a commentary or informative nature.

The Appeal of Open Source

Open source, or free, software basically means that a person can view and change the source code of software that they download or own. While this doesn’t mean much to users directly, it does mean that many different people can work on a project to make it better. Openness has no value on its own (see the “Heartbleed” OpenSSL bug and its aftermath), but as with OpenSSL, things can obviously improve when given an incentive.

For now, open source technology is used in many closed source operating systems. For example, the Liblouis braille tables are used in iOS, macOS, and, through BRLTTY, most Linux distributions. While the software is not perfect, it is often made for more than one operating system, has a helpful community of users, and, best of all for accessibility, has developers who are more likely to consider accessibility. This is greatly helped by platforms for open source development, like GitHub and GitLab, which allow users to post “issues” on projects, including accessibility ones.
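
As a small illustration of that reuse, here is a minimal sketch using Liblouis’ Python bindings, the same library Orca and NVDA lean on for braille. It assumes the louis module and the UEB grade 2 table are installed; the table name varies by system:

```python
# A minimal sketch: translating print text to contracted braille with
# Liblouis' Python bindings. Assumes the "louis" module and the UEB
# grade 2 table are installed; table names differ between systems.
import louis

TABLES = ["en-ueb-g2.ctb"]  # Unified English Braille, grade 2

text = "Hello, world!"
braille = louis.translateString(TABLES, text)
print(braille)  # braille output in the table's encoding (often ASCII braille)
```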

The Appeal of Closed Source

People like getting paid. I should know, as a working blind person who does love getting paid for time and effort well spent. People love keeping things hidden while they’re being worked on. I wouldn’t want a reader reading an incomplete blog post, after all, and spreading the word that “Devin just kind of wrote a few words and that’s all I got from the blog.” People love being able to claim their work as theirs, instead of having to share the credit with other people or companies. I don’t have direct experience with this, because I need all the help I can get, but in my opinion, it is a factor in choosing to create on your own, as a user or a company. Another great thing about closed source is that your competitors can’t copy what you’re doing as you do it, and when you’re an important company, with allegiance to your shareholders, you must do everything you can to keep making money. But what about accessibility?

Open Source Accessibility

Accessibility of open source projects varies a lot. For example, before RetroArch was made accessible, its interface was not usable by blind people. Now, though, I can use it easily. However, current versions of the KDE Plasma desktop do not work well with the Orca screen reader. The following quote is from the release notes for KDE’s latest desktop version:

> KDE is an international technology team that creates free and open source software for desktop and portable computing. Among KDE’s products are a modern desktop system for Linux and UNIX platforms, comprehensive office productivity and groupware suites and hundreds of software titles in many categories including Internet and web applications, multimedia, entertainment, educational, graphics and software development. KDE software is translated into more than 60 languages and is built with ease of use and modern accessibility principles in mind. KDE’s full-featured applications run natively on Linux, BSD, Solaris, Windows and Mac OS X.

Source

“Modern accessibility principles,” you say? In my opinion, we seem to be talking about different definitions of “accessibility.” Yes, there are multiple definitions: being able to be accessed, being able to be found, and being easy to deal with. As stated in the About section of this site, I use accessibility to mean being able to be used completely by blind people. This carries the implication that every single function, and all needed visual information, must be conveyable to a blind person for something to count as accessible. That rules out the “good enough” approach that so many blind people accept as the status quo. Luckily for blind people who would love to use KDE, there is work being done on this issue.

GNU, the project behind much of Linux, also has an Accessibility Statement, which seems to be very out of date: it references Flash Player and Silverlight, which are no longer in common use, and does not mention Apple’s iOS, Google’s Android, or other modern technologies which are not open source (or are, but might as well not be because of the necessity of closed-source services), but which include assistive technologies. I encourage every adventurous blind person to make themselves available for testing open source software and operating systems; user testing was mentioned by the KDE team as something blind people could do to help. Believe me, having an environment which is a “joy to use” is a dream of mine.

GNOME and MATE accessibility is okay, but it does not come close to the accessibility of Windows and Mac systems. For a good example, if you press Alt + F1 in GNOME, and probably MATE too (I have tested this; MATE works a lot better than GNOME), you may only hear “window.” Advanced users will know to type something in GNOME, or use the arrow keys in MATE, but regular users should not have to learn to hunt around due to bad accessibility. The fact that less technically inclined users do use Linux is a testament to blind people’s ingenuity and ability to adapt, rather than to the accessibility of the platform.

Open source accessibility is so hit and miss because there are so many standards. There is the GTK framework for building graphical apps, which does have some accessibility support, but developers must label the items in their programs with text. There is the Qt framework, whose accessibility support seems even poorer. Basically, developers can do anything they want, which is good for freedom, but often not great for accessibility. Also, much of the community has not heard of accessibility practices, does not know that blind people use computers, or thinks that we must use braille interfaces to interact with computers and digital devices. This is partly a failure on our part, as we do not “get out there” on the Internet enough. With the advent of an accessible Reddit client, this may begin to change. Further work must be done to give blind users an accessible Reddit interface on the web, for use on computers, not just iPhones. However, GitHub is very accessible, and there is nothing stopping one from submitting issues.
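
To go back to the GTK point for a moment: the ask really is small. Here is a minimal sketch of labeling an icon-only button through GTK 3’s ATK bridge, in Python; the widget names and strings are made up for illustration:

```python
# A minimal sketch: giving an icon-only GTK 3 button an accessible name
# so a screen reader like Orca has something to speak. The names and
# descriptions here are hypothetical.
import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk

win = Gtk.Window(title="Player")
play_button = Gtk.Button.new_from_icon_name("media-playback-start",
                                            Gtk.IconSize.BUTTON)

# Without these two lines, a screen reader may only announce "button".
accessible = play_button.get_accessible()
accessible.set_name("Play")
accessible.set_description("Starts playback of the selected track")

win.add(play_button)
win.connect("destroy", Gtk.main_quit)
win.show_all()
Gtk.main()
```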

Closed Source Accessibility

“Okay, but what about Windows? And Apple? You like Apple, right?” Basically, it’s hard to tell. Software doesn’t write itself; it is written, for now, by people. People can make mistakes, ignore guidelines, or simply not care about accessibility. However, those guidelines do exist, and there is usually one standard per platform, like Apple’s iOS accessibility guidelines. This means that companies can develop accessible software easily, and can be held accountable by managers to uphold accessibility. But even the best of accessible companies do not always do the right thing. Apple, for example, has created two services, Apple Arcade and Apple Research. Apple Arcade contains no games which a blind gamer can play without expending much more effort than a sighted gamer. Apple Research contains some questions with answer buttons which are not labeled, or cannot be activated. Does Apple think that blind people do not want to game, or that we don’t care about our hearing, our hearts, or, for women, their reproductive health? Apple has also created Swift Playgrounds, an app for children to learn to code. It is accessible. But what about adults? Shouldn’t blind adults, who are usually technically inclined enough, be given a chance to learn to code? I’ll probably rant about this in a future article.

Microsoft has been on an accessibility journey for a few years now, but even they have a few problems. First, the voices in Windows 10 are poor for screen reading tasks. They pause way too long at the end of clauses and sentences, leading me, at least, to press Down Arrow to move to the next line before the current line was actually done being spoken, all because the voice paused just long enough to make me think there was no more text to speak. Microsoft’s Xbox Game Pass is great, but I could not find any accessible games in the free rotations. Sure, there’s Killer Instinct, which many blind people enjoy playing, but I found it not only inaccessible, as the menus do not speak, but boring, as the characters all seemed to simply do the same thing. I know that games do not have to be accessible to be fun, but I expect companies who showcase games, like Apple with Arcade, to have at least one accessible game for blind people to enjoy. And I also know that neither Apple nor Microsoft makes these games, but they do choose to advertise them, endorse them even, and it shows that, for Apple Arcade at least, video games are not something they expect blind people to play. Microsoft is proving them wrong, with the release of Halo with screen reader usability in its menus, and the possibility that the next Halo game will be accessible.

Another problem with Microsoft is that not all of their teams are on board. Like Apple with Arcade and Research, Microsoft has the Rewards team. Their quizzes require one to move items around to reorder answers to get the quiz correct. This may be easy, and perhaps fun, for sighted people, but is simply frustrating for blind people. Other problems include the release of the new Microsoft Edge, which, for most screen reader users, requires turning off UI Automation in order to read some items on the web. Otherwise, if Microsoft’s upcoming foldable phone were to come with greatly enhanced accessibility relative to pure Android, and with the Narrator screen reader optimized and made great and enjoyable for a mobile experience, I think Microsoft could take plenty of mobile phone market share back from Apple. (Update: it’s barely any better than any other Android phone, so Apple still wins.) Microsoft already has most general purpose computer users who are blind, so taking from Apple would be a huge win for them regarding accessibility. But, on that, we’ll have to wait and see how far Microsoft takes their commitment to accessibility. The more cynical side of me says that Microsoft will simply slap Android on a folding phone and release it, because why fight Apple.

Reporting Bugs

So, what can we do to make accessibility better? Just about all open source software, previously including the stuff making up this blog, is hosted on GitHub. Just about all closed source companies claim to want your feedback. So, I recommend giving them any feedback you have. I know that giving feedback to Apple is like throwing $100 bills into the ocean: giving your valuable time to something which may offer no results, and just gets you a robotic “thanks” message. I know that talking to Microsoft’s accessibility team may sometimes seem unproductive, because they lead you from Twitter to one of a number of feedback locations. I know that feedback to open source projects may take a lot of time spent explaining and promoting accessibility to a community which has never considered it before. But it all may help.

For a great, and successful, GitHub issue regarding accessibility, see this issue on the accessibility of RetroArch. You can see that I approached the RetroArch team respectfully, with knowledge of basic accessibility and computer terminology. Note that I gave what should happen, what is happening, and what can be done to fix the problem. As the saying goes, if you do not contribute to a solution to a problem, you are a part of the problem. Blind people need to remember to offer solutions, not just whine that something doesn’t work and that they can’t play Poke A Man like everyone else.
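
If it helps, here is a hypothetical skeleton of that structure. The project and symptoms are made up; the shape is what matters:

```
Title: Menus are not readable by screen readers

What should happen:
Each menu item's text is exposed to the platform accessibility API, so a
screen reader (NVDA, Orca, VoiceOver) announces it as the user arrows
through the menu.

What actually happens:
Arrowing through the menus produces silence; the items expose no
accessible name.

Possible fix:
Label the menu widgets through the toolkit's accessibility interface,
or mirror the focused item's text to the screen reader.
```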

Also, share links to your feedback with other blind people who can vote, thumb up, or comment on it. If you do comment, please remember that feedback does not net instant results. I’m still waiting on Webcamoid to have an accessible interface. But at least I’ll know when something changes, and I could even pay for features to be implemented.

This is opposed to the closed source model, where feedback is “passed on to the team,” or you are thanked, by your iPhone, for your feedback, but never hear anything back from developers, and you most definitely cannot pay for specific features to be worked on, or donate to projects that you feel deserve it. You must hope, and have faith, that large companies with more than one billion users care enough to hear you. For perspective, if every blind person stopped using an iPhone, Apple would not miss the lost sales, compared to its billions of sighted users. However, the engineers who work on iOS accessibility are people too, with deadlines, lives, and feelings, and we should respect that they are probably tightly restricted in answering feedback, fixing bugs, and creating new, exciting features.


As for me, I will continue to support open source software. I’ll keep using this Mac and iPhone because they work the best for me and what I do for work and writing. But, believe me, when something better comes along, I’ll jump ship quickly. As blind people, I feel, we cannot afford to develop brand loyalty. Apple, Microsoft, or Google, I think, could drop accessibility tomorrow, and there we’d be, left in the cold. I highly doubt they will. They may let it lie stagnant, but they probably won’t remove it. I do not write this to scare you in the least, but to make you think about how much control you actually have over what you use, how companies and developers view us, and how we can improve the situation for ourselves. If sighted people notice a bug or want a feature in iOS or Windows, they can gather the tech press and pressure Apple or Microsoft. If we find an accessibility bug, do we have enough clout, or unity, to pressure these companies? Writing feedback, testing software, trying new things, writing guides and fixing documentation, or, if able, translating software into other languages are all things that any blind person can do. I’m not saying that I’m perfect at any of this. I just think that we as a community can grow tremendously if we strike out from our comfortable Windows PCs, Microsoft Word, audio games, TeamTalk, and old speech synthesizers.

I’ll give some projects you could try out and give feedback on:

Element chatting service

On the stagnation of screen readers

If you have sight, imagine that in every digital interface, the visuals are beamed directly into your eyes, into the center and peripheral vision, blocking out much else, and demanding your attention. All “visuals” are mostly text, with a few short animations every once in a while, and only on some interfaces. You can’t move it, unless you want to move everything else, like videos and games. You can’t put it in front of you, to give you a little space to think and consider what you’re reading. You can’t put it behind you. You can make it softer, but there comes a point where it’s too soft and blurry to see.

Also imagine that there is a form of art that 95% of other humans can produce and consume, but that for you is either blank or filled with meaningless letters and numbers ending in .JPEG, .PNG, .BMP, or other computer jargon, and the only way to perceive it is to humbly ask that the image be converted to the only form of input your digital interface can understand: straight, plain text. This same majority of people have access to everything digital technology has to offer. You, though, have access to very little in comparison. Your interface cannot interpret anything that isn’t created in a standards-compliant way. And this culture, full of those who need to stand out, doesn’t like standards.

There is, though, a digital interface built by Apple which uses machine learning to try to understand this art, but it’s Apple only, and they love control too much to share it with interfaces on other companies’ systems. And there are open source machine learning models, but the people who could use them are too busy fixing their interface to work around breaks in operating system behaviour and UI bugs to research that. Or you could pay $1099, or $100 per year, for an interface that can describe the art, by sending it to online services of course, and get a tad more beauty from the pervasive drab, plain text.

Now, you can lessen the problem of eye strain, blocked out noise, and general information fatigue by using a kind of projector, but other people see it too, and it’s very annoying to those who don’t need this interface, with its bright, glaring lights, moving quickly, dizzyingly fast. It moves in a straight line, hypnotically predictable, but you must keep up, you must understand. Your job relies on it. You rely on it for everything else too. You could save up for one of those expensive interfaces that show things more like print on a page… if the page had only one small line and was rather slow to read, but even that is dull. No font, no true headings, no beauty. Just plain, white on black text, everywhere. Lifeless. Without form and void. Deformed and desolate. Still, it would make reading a little easier, even if it is slower. But you don’t want to be a burden to others or annoy them, and you’ve gotten so used to the close, direct, heavy mode of the less disruptive output that you’re almost great at it. But is that the best for you? Is that all technology can do? Can we not do better?


This is what blind people deal with every day. From the ATM to the desktop workstation, screen readers output mono, flat, immovable, unchanging, boring speech. There is no HRTF for screen readers. Only one can describe images without needing to send them to online services. Only a few more can describe images at all. TalkBack, a mobile screen reader for Android, and ChromeVox, the screen reader on Chromebooks, can’t even detect text in images, let alone describe images. Update: TalkBack can read text and icons now, but not describe images. ChromeVox still can’t do any of that. All of them read from top to bottom, left to right, unless they are told otherwise. And they have to be specifically told about everything, or it’s not there. We can definitely do better than this.

Response to “Why Linux Is More Accessible Than Windows and MacOS”

Today, I came across an article called Why Linux Is More Accessible Than Windows and macOS. Here, I will respond to each point of the article. While I applaud the author’s wish to promote Linux, I think the points given are rather shallow and very general, and could be made in a comparison of almost any operating systems.

1. The Power of Customization

In this section, the author argues that, while closed source systems do have accessibility options, people with disabilities (whom the author calls “differently abled,” a term some people with disabilities would consider ableist, since it feels more like inspiration porn) have to compromise on what modifications they can make to their closed source operating systems. This can be true, but from my experience using MacOS, Windows, iOS, Android, and Linux, closed source systems have a wider community of people with disabilities using them, and thus have add-ons and extensions that allow for as few compromises to the user’s experience as possible.

Another point that must be kept in mind is that Linux is not the most user-friendly OS yet. The modifications that can be made with Linux go beyond those possible in MacOS and Windows, yes. But I, for example, want to hold down the Space bar and have that register as holding the Control key. I probably cannot do that in Windows or MacOS. I surely can do it in Linux, but it would take a lot of learning about key codes and how to change keyboard maps across the console and X/Wayland sessions. The GUI will not provide this ability. The best I can do with the GUI is change Caps Lock to Control.
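
For what it’s worth, the Caps-Lock-to-Control case really is a one-liner outside the GUI; hold-Space-as-Control, by contrast, needs a key-interception daemon such as keyd or kmonad, which is exactly the “lot of learning” I mean. A minimal sketch of the easy case, assuming an X session and the standard ctrl:nocaps option:

```python
# A minimal sketch: applying the one remap the GUI also offers,
# Caps Lock as Control, via setxkbmap. Assumes an X session.
import subprocess

subprocess.run(["setxkbmap", "-option", "ctrl:nocaps"], check=True)
```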

Also, let’s say a new user installs a distribution like Fedora Linux and needs a screen reader, or any accessibility service. The user has done a little homework, so they know to turn on Orca with Alt + Super + S. The user then launches Firefox from the “run application” dialog. And it doesn’t work. Nothing reads. Or the user runs a media player, and gets the same result. Why is this? I’ll spare you the arduous digging needed to find the answer. In the personalization section of a desktop’s system menu, or in the Assistive Technologies dialog, there is a checkbox which must be checked before assistive technologies will even work correctly with the rest of the system. The user has to know that it’s there, how to get to it in their chosen desktop environment, and how to check the box and close the dialog. All this before even doing anything else with their system.

This means that, out of the box, on almost all Linux distributions, this one checkbox shows that the Linux GUI, by nature of needing the box to be checked, is hostile to people with disabilities. Can distribution maintainers check this box by default? Yes. Do they? No. Does this box even need to exist? No. Assistive technologies could be enabled by default, with advanced users able to disable them, after a warning in the comments of a configuration file, only by changing that configuration file.
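
For the curious, on GNOME-based desktops that checkbox usually just flips a GSettings key. A sketch of flipping it programmatically, assuming the standard schema names are present:

```python
# A minimal sketch: flipping the hidden "enable assistive technologies"
# switch and starting the screen reader on a GNOME-based desktop.
# Assumes the standard GSettings schemas are installed.
from gi.repository import Gio

interface = Gio.Settings.new("org.gnome.desktop.interface")
interface.set_boolean("toolkit-accessibility", True)  # the infamous checkbox

a11y = Gio.Settings.new("org.gnome.desktop.a11y.applications")
a11y.set_boolean("screen-reader-enabled", True)  # launch Orca at login

Gio.Settings.sync()  # flush the changes to dconf before exiting
```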

2. Linux Is Stable and Reliable

About fifteen minutes ago, I was using Gmail in the latest Google Chrome on Fedora Linux. Suddenly, the screen reader, Orca, stopped responding as I tried to move to the next heading in an email. I switched windows, and nothing happened. I got speech back after a good 20 seconds, but that shows that Linux isn’t quite as stable as the author may believe. At least, not every distribution.

My experience is my own; I do not claim to be an expert in Linux usage or administration. But it is still my experience: while Linux is stable, and I can use it for work, it is not as stable, especially in the accessibility department, as Windows or MacOS. I would say, though, that it is more usable than MacOS, where doing just about anything in Safari, the web browser, results in Safari going unresponsive for a good five seconds or more.

Another important point is that while many developers hammer away at the core of Linux, how many people maintain AT-SPI, the Linux bridge between programs and accessibility services? How many people make sure the screen reader is as lean and performant as possible? How many people make sure that GTK is as quick to give accessibility information as it is to draw an image? How many people make sure that when a user starts a desktop, focus is set somewhere sensible so that a screen reader reads something besides “window”? My point is that open source is full of people who work on what they want to work on. If a developer isn’t personally impacted by accessibility needs, that developer is much less likely to code with accessibility in mind. So let’s stop kidding ourselves into thinking that overall work on Linux includes even half the needed work on accessibility specifically.

While Linux’s accessibility crawls towards betterment at about one fix every month or two, Windows and MacOS have actual teams of people working specifically on accessibility, and a community of disabled developers working on third-party solutions to any remaining problems. Do all the problems get fixed? No, especially not in MacOS. But the principle that more eyes on a problem means more things get noticed applies significantly to accessibility.

3. Linux Runs on Older Hardware

This section is one I can agree with completely. Linux running on old hardware is what will drive further adoption when Windows 11 begins getting more features than Windows 10. This is even more important for people with disabilities, who usually have much less money than people without disabilities, so cannot upgrade computers every year, or even every three or five years.

4. Linux Offers Complete Control to the Users

This is true if the user is an advanced Linux user. If the user is just starting out with Linux, or with computers in general, it is very false. How would it feel to be trapped in a place without a gate, without walls, without doors, without windows? That’s how a new computer user can feel when dealing with Linux, especially a blind person, who needs to know how to use the keyboard, what the words the speech is saying mean, and what all the terminology means, while perhaps not even knowing where the Space bar is, or how to turn the computer on.

This is a huge issue for every operating system, but it was somewhat solved on MacOS, which added a wonderful tutorial for VoiceOver, its screen reader, and guides the user to turn it on when the computer first starts, without the user having to touch a single key.

As for this piece:

> On the other hand, Linux shares every line of code with the user, providing complete control and ownership over the platform. You can always try new technologies on Linux, given its inherent nature, compatibility, and unending support for each of its distros.

This is practically wrong. First, new Linux users won’t understand the code that Linux “shares” with them. New Linux users will not know where to look to find this code. So this really doesn’t help them. Open source or closed, the OS is going to be a black box to any new user. And new users are what count: if new users do not want to stay on Linux, they will not spend the time to become old users, who could then teach newer users. Also, good luck trying new technologies on Debian.

Accessibility Comparison Between Linux and Windows

Here, the author compares a few access methods: a thing the author calls “screen reader” on Linux, which I hope they know is called Orca, versus Windows Narrator, the worst option on Windows, but a built-in one.

The author doesn’t mention NVDA on Windows, which is far more powerful than Narrator, and has several add-ons to enhance its functionality even further. One can add many different “voice modules” to Windows, and NVDA has plenty of add-on voices as well, many of which are not available on Linux, like DECtalk, SoftVoice, and Acapela TTS.

Accessible-Coconut: An Accessible Linux Distribution

I’m going to be blunt here: this distribution is based on an old LTS version of Ubuntu, so it will lack the latest versions of Orca, AT-SPI, GTK, and everything else. If you want something approaching good, try Slint Linux. That’s about the most user-friendly distribution for the blind out there right now. Fedora’s MATE spin is what I use, but Orca doesn’t come on at startup, and neither is assistive technology support enabled.

Linux Distros Cater to Every User Type

This summary restates the points made in the article, and ends with the author inviting “you” to try Linux if “you” want your computer to be more accessible. I suppose the author is pointing people to Accessible-Coconut. At this point, I would rather users do a ton of reading about Linux, the command line, Orca, and all the accessibility documentation they can find, try the Windows Subsystem for Linux, and then, if they want more, put Linux on a separate hard drive and try it that way. I would definitely start with Slint, or Fedora, but never with a lackluster distro like Accessible-Coconut.

Analyzing the Windows 11 Accessibility Announcement

Microsoft announced Windows 11 a few weeks ago and, from my searches at least, still doesn’t have an audio-described version of the announcement. Update: there’s one now. Anyway, they also released a post on their Windows blog about how Windows 11 was going to be the most accessible, inclusive, amazing, delightful thing ever! So, I thought I’d analyze it heading by heading to try to figure out what’s fluff and what’s actual new stuff worthy of announcement.

Beyond possible, efficient and yes, delightful

So, they’re trying to reach what the CEO, in his book “Hit Refresh,” called the “delightful” experience he wanted to work towards. His gist was that Windows was pretty much required now, but he wanted to make it delightful. Well, the only user interface that is delightful to me is Emacspeak. MacOS and iOS come close. What makes them delightful are a few things: sound, quality speech, and speech parameter changes. I won’t go over all that here; my site has plenty on it already. But it’s safe to say that Microsoft isn’t going near that anytime soon.

Instead of trying to offload the cognitive strain of parsing speech all day, they put even more on it. Microsoft Edge has “page loading. Loading complete.” Teams has similar textual descriptions of what’s going on. And while I appreciate knowing what’s going on, speech takes a second to happen, be heard, and be processed. Sound happens a lot quicker, and over time a blind user can get pretty good at recognizing what’s going on. But whenever I brought this up to the VS Code team, they said something about not having the ability to add sounds, so they’d have to drag in some other dependency, so they’d have to bring that up with the team, and all that. Well, they won’t become the most delightful editor for the blind any time soon. Just the easiest to use.

And, while this is partly the fault of screen reader developers who just won’t focus on sound or speech parameter changes for things like text formatting, Microsoft could be leading the way here with Narrator. And yes, they’ve got a few sounds, and their voices can change a little for text formatting, but their TTS is just too limited to make that really flexible and enjoyable. Instead of changing pitch, rate, and intonation, they change pitch, rate, and volume, and sometimes it’s jarring, like the volume changes. But there’s not much else they can do with their current technology. I guess they’ll have to change the speech synthesis engine a bit, if they’re even able to. In the past six years, I’ve not seen any new, or better, first-party US English voices for Windows. Sure, they have their online voices, which are rather good, but they haven’t shown any inclination to bring that quality to the Windows OneCore voices.
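
For contrast, richer engines accept markup like SSML, where pitch, rate, and emphasis can shift per span of text. This is a hypothetical fragment, and whether Microsoft’s offline voices would honour all of it is exactly the question:

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xml:lang="en-US">
  Body text at normal pitch, then
  <prosody pitch="+15%" rate="110%">a heading, higher and slightly faster,</prosody>
  then <emphasis level="strong">bold text with genuine emphasis</emphasis>,
  not just a jarring volume jump.
</speak>
```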

People fall asleep listening to Microsoft David. He’s boring and should not be the default voice. While this is anecdotal, I’ve heard quite a few complaints about it, and if you listened to him for a long time, you’d probably get bored too. This is seriously not a good look, or rather, sound, for people who are newly blind and learning to use a computer without sight, for someone who doesn’t know that there are other voices, or even for Microsoft demonstrating Narrator to people who haven’t used it before. And while NVDA users can use a few other voices, the defaults should really be good enough. Apple has had the Alex voice for years. Over ten years, in fact. He’s articulate and can parse sentences and clauses at a time, allowing him to intone very close to the way humans speak, with context. He’s not the most lively voice, but he sounds professional. And Alex is the default voice on MacOS. David, on Windows, just sounds bored. And so blind people, particularly those used to Siri and VoiceOver on iOS, just plain fall asleep. It’s nowhere near delightful.

Windows 11 is the most inclusively designed version of Windows

Okay, sure. Even though from what I’ve heard from everyone else, it’s just the next release of Windows 10. But sure, hype it up, Microsoft, and watch the users be disappointed when they figure out that, yeah, it’s the same old bullcrap. Bullcrap that works okay, yeah, but still bullcrap.

> People who are blind, and everyone, can enjoy new sound schemes. Windows 11 includes delightful Windows start-up and other sounds, including different sounds for more accessible Light and Dark Themes. People with light sensitivity and people working for extended periods of time can enjoy beautiful color themes, including new Dark themes and reimagined High Contrast Themes. The new Contrast Themes include aesthetically pleasing, customizable color combinations that make apps and content easier to see.

Okay, cool, new sounds. But are there more sounds? Are there sounds for animations? Are there sounds for when copying or other processes complete? Are there sounds that VS Code and other editors can use? Are there sounds for when auto-correct or completion suggestions appear? Are there sounds for when an app launches in the background, or a system dialog appears? Are there sounds for when windows flash to get users’ attention?

> And, multiple sets of users can enjoy Windows Voice Typing, which uses state-of-the-art artificial intelligence to recognize speech, transcribe and automatically punctuate text. People with severe arthritis, repetitive stress injuries, cerebral palsy and other mobility related disabilities, learning differences including with severe spelling disabilities, language learners and people that prefer to write with their voice can all enjoy Voice Typing.

Um, yeah, this has been on Windows for years. Windows + H. I know. I get it.

> …design and user experience. It is modern, fresh, clean and beautiful.

Okay, but is it fresh, clean, and beautiful for screen readers? Are there background sounds to help us focus, or maybe support for making graphs audible for blind people, or support for describing images offline? Oh wait, wrong OS, haha. Funny how Apple’s OSes are more modern when it comes to accessibility than Microsoft’s.

Windows accessibility features are easier to find and use

Okay, this whole section has been talked about before, because it’s no different from the latest Windows Insider build. Always note that when companies have to fill blog posts with stuff they’ve had for months or a year now, it means they really, really don’t have anything new to show, or say. They just talk because not doing so would hurt them even more. Contrast this with Apple’s blog post on Global Accessibility Awareness Day, where everything they talked about was new or majorly improved. And all Microsoft did that day was “listen”. There’s a point where listening has gathered enough data, and it’s time to act! Microsoft passed that point long ago.

> Importantly, more than improving existing accessibility features, introducing new features and making users’ preferred assistive technology compatible with Windows 11, we are making accessibility features easier to find and use. You gave us feedback that the purpose of the “Ease of Access” Settings and icon was unclear. And you said that you expected to find “Accessibility” settings. We listened and we changed Windows. We rebranded Ease of Access Settings to Accessibility and introduced a new accessibility “human” icon. We redesigned the Accessibility Settings to make them easier to use. And of course, Accessibility features are available in the out of box experience and on the Log on and Lock screens so that users can independently setup and use their devices, e.g., with Narrator.

So, the most important thing they’ve done this year is what they’ve already done. Got it. Oh, and they changed Windows. Just for us, guys. They did all that hard work of changing a name and redoing an icon, just for us! Oh so cringeworthy. This “courage” thing is getting out of hand. Also, if changing Windows is so hard, maybe it’s time to talk to the manager. Seriously. If it’s so hard to do your job that changing a label and an icon is hard work, there’s something seriously wrong, and I almost feel bad for the Windows accessibility team now.

Windows accessibility just works in more scenarios

> Windows 11 is a significant step towards a future in which accessibility “just works,” without costly plug-ins or time-consuming work by Information Technology administrators. With Windows 10, we made it possible for assistive technologies to work with secure applications, like Word, in Windows Defender Application Guard (WDAG). With Windows 11, we made it possible for both Microsoft and partner assistive technologies to work with applications like Outlook hosted in the cloud…

Okay, so, on Twitter, Joseph Lee has complained that the Windows UI team isn’t writing proper code to let screen readers read and interact with apps in Windows 11’s Insider builds. So right there, we’re still going to need Windows App Essentials, an NVDA add-on that makes Windows 11 a lot easier to use. This add-on is mostly for the first-party apps, like Weather and Calculator. So, um, what’s all this about again? Nothing seems to be new. We will still need “costly” add-ons and plugins and junk, because I don’t see Microsoft fixing those UI issues by release. System admins, keep that list of NVDA add-ons around, because they’ll still be needed in Windows 11.

> …Remote Application Integrated Locally (RAIL) using Narrator. While that may sound like a lot of jargon to most people, the impact is significant. People who are blind will have access to applications like Office hosted in Azure when they need it.

Yeah, because people with disabilities are dumb and can’t understand tech speak. Sure. Okay. Keep dumbing us down, Microsoft. We really enjoy the slap in the face. Just explain the terms, like RAIL, which the quote itself expands: Remote Application Integrated Locally, that is, an app hosted in the cloud but presented as if it were running on your own machine. A quick Google search instead surfaces Azure’s support for Ruby on Rails, which has nothing to do with it; that’s how well this jargon communicates. Keep lording your tech knowledge over us, oh great Elites at Microsoft.

What I want to see is Electron apps getting OS-level accessibility support, so that VS Code doesn’t have to feel like a web app, because it shouldn’t feel like that on Microsoft’s own OS.

Now, being able to host Office on a server and have Narrator, and hopefully other screen readers (because Narrator is still not good enough), support it is nice. But that’s not really a user-facing feature. Users probably won’t know Word is hosted on a server.

> Windows 11 will also support Linux GUI apps like gedit through the Windows Subsystem for Linux (WSL) on devices that meet the app system requirements. And, we enabled these experiences to be accessible. For example, people who are blind can use Windows with supported screen readers within WSL. In some cases, the assistive technology experience is seamless. For example, Color Filters, “just work.” Importantly, the WSL team prioritized accessibility from the start and committed to enable accessible experiences at launch. They are excited to share more with Insiders and to get feedback to continue to refine the usability of their experiences.

“In some cases…” Wanna elaborate a bit, Microsoft? Will I be able to use gedit with a screen reader? Or Kate? Or Emacs? I have gotten Emacs with Emacspeak working on WSLg in Windows Insider builds, but it’s too sluggish to be used productively. So yeah, if that’s the same experience as using a screen reader with it, I don’t see myself using it much, if at all.

> …experiences we introduced last week like our partnership with Amazon to bring Android apps to Windows in the coming months.

Okay, well, I’m waiting. I suspect they’ll use something similar to what they did with the Your Phone app: just pipe accessibility events to the screen reader through, I think, the title bar. That’ll be okay, I guess, but no sound feedback would mean the experience isn’t quite up to TalkBack standards, as low as those are.

Modern accessibility platform is great for the assistive technology ecosystem

> …closely with assistive technology industry leaders to co-engineer what we call the “modern accessibility platform.” Windows 11 delivers a platform that enables more responsive experiences and more agile development, including access to application data without requiring changes to Windows.

I’m not going to pretend to understand that last bit, but if the UI problems found by Joseph Lee are any indication, a lot more has been broken than fixed or added. Also, which assistive technology industry leaders? And what biases do they have?

> We embraced feedback from industry partners that we need to make assistive technology more responsive by design. We embraced the design constraints of making local assistive technology like Narrator “just work” with cloud hosted apps over a network. We invented and co-engineered new Application Programming Interfaces (APIs) to do both; to improve the communication between assistive technologies like Narrator and applications like Outlook that significantly improve Narrator responsiveness in some scenarios. The result is that Narrator feels more responsive and works over a network with cloud-hosted apps.

I, as a user, don’t care about cloud-hosted apps. Office may at some point become a cloud-hosted app, and that’s what they may be preparing for, but I don’t care about that. Responsiveness is cool and good, but NVDA is very responsive, and some people still fall asleep using it. Why? Because it sounds boring! The voices in Windows suck. No audible animations or anything to make Windows delightful.

> We also embraced feedback from industry partners that we need to increase assistive technology and application developer agility to increase the pace of innovation and user experience improvements. We made it possible for application developers, like Microsoft Office, to expose data programmatically without requiring Windows updates. With Windows 11, application developers will be able to implement UI Automation custom extensions, including custom properties, patterns and annotations that can be consumed by assistive technologies. For users, this means we can develop usability and other improvements at the speed of apps.

At the speed of apps. That’s pure marketing crap. A lot is said in this article that is pure marketing, not measurable fact. I want real, factual updates, not this. And the fact that they don’t provide that is a hint that they have nothing to provide. Now, having “custom” roles and states and such is nice for developers who have to reinvent the wheel and the atoms that make up that wheel, so maybe new applications have a chance of being accessible. But accessibility won’t happen with developers unless it’s in their face. They probably won’t know about these abilities, or even care in many cases.

Try Windows 11 and give us feedback

I’ve read feedback from those who have tried the Windows 11 Preview. I can’t try it myself, because my machine has no TPM chip and I don’t feel like being rolled back to Windows 10 when 11 is released. The feedback I’ve gotten so far from others is, well, very little, actually. From what I’ve heard, it’s still just Windows 10.

Conclusion

So, why should I even care about Windows 11? Not much is new or changed or fixed for accessibility, as this article full of many empty words shows. Six years of development, and the Mail app still has that annoying bug of expanding threads whenever keyboard focus lands on them, instead of waiting for the user to expand them manually. The Reply box still doesn’t alert screen readers that it’s opened, so the screen reader thinks it’s still in the message pane being replied to, and not in the reply edit field. The Microsoft voices still sound pretty bad, even worse than Google’s offline TTS now, and that’s pretty bad.

Will any of this change? I doubt it. I’ve lost a lot of confidence in Microsoft, first because of their do-nothing stance on Global Accessibility Awareness Day, then their event without audio description, which Apple did perfectly, and now this article which tells us very little, and is almost a slap in the face when it talks about Windows being “delightful” because really, it’s not, and it won’t change substantially enough before release to be so.


Digging into TalkBack’s source code

Braille

For a while now, I’ve been curious about which platform’s accessibility is, at its foundation, more secure, more “future proof”, and better able to be extended. Today, I’m looking into the TalkBack source code that is currently on Github, which I cloned just today. I’ll go through the source, to see if I can find anything interesting.

First of all, according to this file:

This whole project was started in 2015. Of course, we then have this one:

Which shows that it is copyright 2020. The first just seems to wrap Liblouis in Java, but what about this one?

Ah, it seems to be the thing that translates the table files and such into Java things. So that’s kind of where the Braille keyboard gets its back-end. Now let’s look at the front-end.

So, this was made in 2019. I do like seeing that they have been working on this stuff. Now, here, we have:

/** Stub implementation of analytics used by the open source variant. */

Yeah, figured I wouldn’t get much out of this file.

Now here’s where we might just get something:

https://github.com/google/talkback/blob/master/brailleime/src/main/java/com/google/android/accessibility/brailleime/dialog/ContextMenuDialog.java#L4

This part was made in 2020. When they need to crank out a feature, they really get rolling. I just hope they give us a good few features this year.

Oh, now this is pretty considerate, a dialog will show if the Braille keyboard is opened and TalkBack is off:
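The file itself isn’t embedded here, but to give a rough idea of the kind of check involved: Android lets any app ask whether touch exploration is running. This is just my sketch of the general shape, not TalkBack’s actual code.

#+beginsrc java
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

// A minimal sketch, not TalkBack's real code: ask the system whether
// touch exploration (a good proxy for "a screen reader is running") is on.
final class ScreenReaderCheck {
    private ScreenReaderCheck() {}

    static boolean isScreenReaderActive(Context context) {
        AccessibilityManager am =
                (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        // Touch exploration is only enabled while a TalkBack-style service runs.
        return am != null && am.isEnabled() && am.isTouchExplorationEnabled();
    }
}
#+endsrc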

And this one for if a device doesn’t support enough touch points (like an iPhone!):
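Again, only a sketch of the sort of capability check that file could be doing. Android really does have a feature flag for screens that track five or more fingers; whether TalkBack checks exactly this flag is my assumption.

#+beginsrc java
import android.content.Context;
import android.content.pm.PackageManager;

// FEATURE_TOUCHSCREEN_MULTITOUCH_JACUZZI is the platform's real flag for
// "tracks five or more distinct touch points," which six-finger braille
// typing would need. The method name here is mine, for illustration.
final class TouchPointCheck {
    static boolean supportsManyTouchPoints(Context context) {
        return context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_JACUZZI);
    }
}
#+endsrc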

Okay, so this next one seems to allow the braille keyboard to grab braille from a file, then turn it into something else:

This could lead to a sort of substitutions list, or spell checking or braille correction facility.

Ah, now this is a good summary of the interface. Also yeah, looks like lots of geometry.

Okay, this part is rather interesting:

/** Reads saved points from SharedPreference. */

So, does that mean it can remember where I most put my fingers for typing?
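For the curious, saving and loading those points would be plain SharedPreferences work. Here’s a minimal sketch; the preference file name, key, and encoding are my own guesses, not the real implementation’s.

#+beginsrc java
import android.content.Context;
import android.content.SharedPreferences;

// A minimal sketch of persisting calibrated dot positions. The file name,
// key, and the idea of an "x,y;x,y" string encoding are assumptions.
final class CalibrationStore {
    private static final String PREFS = "braille_keyboard";
    private static final String KEY_POINTS = "calibrated_points";

    static void savePoints(Context context, String encodedPoints) {
        SharedPreferences prefs =
                context.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
        prefs.edit().putString(KEY_POINTS, encodedPoints).apply();
    }

    static String loadPoints(Context context) {
        SharedPreferences prefs =
                context.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
        return prefs.getString(KEY_POINTS, null);
    }
}
#+endsrc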

And here’s the jackpot:

Here, we learn how Google works around Explore by Touch to give us a braille keyboard that bypasses TalkBack’s own touch interaction model.
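My best guess at the mechanism, for those wondering: since Android 11, an accessibility service can declare a touch passthrough region, where raw touches skip explore by touch entirely. A sketch, assuming that API; I can’t promise TalkBack does it exactly this way.

#+beginsrc java
import android.accessibilityservice.AccessibilityService;
import android.graphics.Rect;
import android.graphics.Region;
import android.view.Display;
import android.view.accessibility.AccessibilityEvent;

// A sketch assuming the Android 11+ passthrough API: touches inside the
// declared region bypass the screen reader's gesture detection and reach
// the keyboard directly.
class PassthroughSketch extends AccessibilityService {
    void letBrailleKeyboardReceiveRawTouches(Rect keyboardBounds) {
        // Everything inside keyboardBounds now skips explore by touch.
        setTouchExplorationPassthroughRegion(
                Display.DEFAULT_DISPLAY, new Region(keyboardBounds));
    }

    @Override public void onAccessibilityEvent(AccessibilityEvent event) {}
    @Override public void onInterrupt() {}
}
#+endsrc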


Braille on Android

Over the last few months, I've been focusing a lot on Braille. Much of it is because the Bluetooth earbuds I have (Galaxy Buds Pro, LinkBuds S, Razer Hammerheads) either have poor battery life or have audio lag that's just annoying enough to make me not want to use them for regular screen reading. So, grabbing a Focus 14, I began to use Braille a lot. I've now spent a good two weeks using Android TalkBack's new Braille support, and two weeks with VoiceOver's Braille support.

In this article, I'll overview Android's past support for Braille, and talk about how its current support works. I'll also compare it to Apple's implementation. Then, I'll discuss how things could be better on both systems. Since there have probably been many posts on other sites about iOS' Braille support, I don't feel like I need to write much about that, but if people want it, I can write a post from that angle as well.

BrailleBack and BRLTTY

When Google first made accessibility a priority on Android, back around Android 2.3, it created a few stand-alone apps. Well, they were kind of standalone: TalkBack, the screen reader; KickBack, which vibrated for accessibility focus events; and BrailleBack, for Braille support. There may have been more, but we'll focus on BrailleBack here. BrailleBack connected to Braille displays over Bluetooth, and used drivers to communicate with them. It started out well for a first version, but wasn't updated much. In the years that followed, the biggest update was to support a new, less expensive Braille display. This has been Google's problem for a while now: having great ideas, but not giving them the attention they need to thrive. Luckily, TalkBack is still being worked on, and hasn't been killed by Google. At least now, Braille support is built in. BrailleBack was never preinstalled on phones, but TalkBack is. So, things may improve starting now.

BRLTTY started out as a Linux program. It connects to Braille displays using USB, serial, or Bluetooth, and supports a huge variety of displays. It tries to give the Braille user as much control over the Linux console from the display as possible, using as many buttons as a display has. It came to Android and offered a better experience for some use cases, but the fact that you can't type in contracted Braille, a sort of shorthand that is standardized into the Braille system, may be off-putting to some. Another issue is that it tries to bring the Linux console reading experience to an Android phone, which takes a bit of getting used to.

So, here, we've got two competing apps. BRLTTY gets updated frequently, has many more commands, but has a higher bar for entry. BrailleBack is stale, supports few displays, but allows for writing in contracted Braille, and has more standardized commands. So, you'd think Deaf-Blind users would have choices, enough to use an Android phone, right?

App support matters

Let's take something that Braille excels at: reading. On Android, Google's support for Braille has been poor up to this point, and it wasn't preinstalled, so Deaf-Blind users couldn't easily set up their phones without knowing about this separate app and having sighted assistance to install it. As a result, third-party apps, like Kindle, and even first-party apps, like Google Play Books, didn't take Braille into account during development. The Kindle app, for example, just has the user double tap a button, and the system text-to-speech engine begins reading the book. The Play Books app does similar, with the option for the app to use the high quality, online Google speech engine instead of the offline one.

This is how things are today, too. In Kindle, we can now read a page of text, and swipe on the screen to turn the page. In Play Books, though, focus jumps around too much to even read a page of text. It's easier to just put on headphones and let the TTS read for you, so Braille literacy, for Android users, becomes too frustrating to cultivate.

So, if you want to read a book on your phone, using popular apps like Kindle, you have to use the system text-to-speech engine. This means that Braille users are cut out from this experience, the one thing Braille is really good at. There are a few apps, like Speech Central, which do display the text in a scrolling web view, so that Braille users can now read anything they can import into that app, but this is a workaround that a third-party developer shouldn't have to make. This is something that Google should have had working well about five years ago.


With the release of iOS 8, eight years ago, Apple gave Braille users the ability to “Turn Pages while Panning.” This feature allowed Braille users to read a book without having to issue a page-turn command; the page simply turns as they pan. Even before that, unlike Android even now, Braille users could use a command on their Braille display to turn the page. Eight years ago, they no longer had to even do that.

A year later, the Library of Congress released an app called BARD Mobile, allowing blind users to access books from the service for the blind and print disabled on their phones. Along with audio books, Braille books were available. Using a Braille display, readers could read through a book, which was just a scrolling list of Braille lines, without needing any semblance of print pages. Android's version of BARD Mobile got this feature about a year ago. And now, the new Braille support doesn't support showing text in Computer Braille, which is required to show the contracted Braille of the book correctly. I'd chalk this up to a tight schedule at Google, and to the feature not having been worked on for long. Perhaps version 14 of TalkBack will include this, allowing Braille readers to read even Braille books.

Now in Android... Braille

With the release of TalkBack 13, Braille support was finally included. Beforehand, though, we got a bit of a shock when we found out that HID Braille wouldn't be supported. This, again, I can chalk up to the Braille support being very new, and the Android team responsible for Bluetooth not knowing that that's something they'd need to implement. Still, it soured what could have been a great announcement. Now, instead of supporting “all” displays, they support... “most” displays. So much for Android users being able to use their brand new NLS EReader, right? Technically, they can use it through BRLTTY, but only if it's plugged into the USB-C port. Yeah, very mobile.

The Braille support does, however, have a few things going for it. For one, it's very stable. I found nothing that could slow it down. I typed as fast as I could, but never found that the driver couldn't keep up with me. Compare that to iOS, where even on a stable build, there are times when I have to coax the translation engine into finishing translating what I've written. There's also a nice feature where, if you disconnect the display, speech automatically comes back on. Although, now that I think about it, that may only be useful for hearing blind people; a Deaf-Blind person wouldn't even know until a sighted person told them that they now know all about that chat with the neighbor about the birthday party they were planning, and that it's no longer a surprise. Ah well, so much for the customizability of Android. In contrast, when speech is muted on iOS, it stays muted.

iOS doesn't sit still

In the years after iOS 8's release, Braille support has continued to improve. Braille users can now read Emoji, for better or worse, have their display automatically scroll forward for easier reading, and customize commands on most displays. New displays are now supported, and iOS has been future-proofed by supporting multi-line or tactile graphics displays.

iOS now also mostly supports displays that use the Braille HID standard, and work continues on finishing that support. This is pretty big, because the National Library Service for the Blind in the US, the same one that offers the BARD service, is teaming up with Humanware to provide an EReader which, while allowing one to download and read books from BARD, Bookshare, and NFB Newsline, also allows one to connect it to a phone or PC to be used as a Braille display. This means, effectively, that whoever wants Braille can get Braille. The program is still in its pilot phase, but will be launched sooner or later. And Apple will be ready.


No, Android doesn't support these new displays that use the Braille HID standard. It also doesn't support multi-line or graphics displays, nor showing math equations in the special Nemeth Braille code, nor automatically scrolling the display, changing Braille commands, and so on. You may then say, “Well, this is just version one of a new Braille support. They've not had time to make all that.” A part of that is true. It is version one of the new Braille subsystem of TalkBack. But they've had the same amount of time to build out both Braille support, and TalkBack as a whole, that Apple has. In fact, they've had the same eight years since iOS 8 to both learn from using Apple's accessibility tools, and to implement those ideas themselves.

So, let's say that Google has been seriously working on TalkBack for the last three years, since new management took the wheel and, thankfully, steered it well. Google may now need at least four years to catch up to where Apple is now. Apple, however, isn't sitting still. They put AI into their screen reader years before the AI-first company, Google, did. How much longer will it take Google to add things like auto-scroll to their screen reader, to serve an even smaller subset of their small pool of blind users?

Neither system is perfect

While Apple's Braille support is fantastic, if a bit rusty with age, both systems could be using Braille a bit better, to really show off why Braille is better than just having a voice read everything to you. One example that I keep coming back to is formatting. For example, a Braille user won't be able to tell what type of formatting I'm using here on either system, even though there are Braille formatting symbols for what I just used. And no, status cells don't count; they can't tell a reader what part of a line was formatted, and the “format markers” used in Humanware displays are a lazy way of getting around... I don't even know what. If BrailleBlaster, using LibLouis and its supporting libraries and such, can show formatting just fine, I don't see why screen readers in expensive phones can't.

Both systems could really take a page from the early Braille notetakers. The BrailleNote Apex not only showed formatting, but showed things like links by enclosing them in Braille symbols, meaning that not only could a user tell where the link started and ended, just like sighted people, they could do so in a way that needed no abbreviated word based on speech. BRLTTY on Android shows switches and checkboxes in a similar way, using Braille to build a nonvisual, tactile interface that uses Braille pictograms, for lack of a better term, to make screen reading a more delightful, interesting experience, while also shortening the Braille needed to comprehend what the interface item is. This kind of stuff isn't done by anyone besides people who really understand Braille, read Braille, and want Braille to be as efficient, and enjoyable, as possible.

Another thing both companies should be doing is testing Braille rigorously. There is no reason why Braille users shouldn't be able to read a book, from start to end, using Google Play Books. There's also no reason why notifications should continue to appear on the display when they were just cleared. Of course, one issue is much more important than the other, but small issues do add up, and if not fixed, can drag down the experience. I really hope that, in the future, companies can show as much appreciation for Braille as they do for visuals, audio, haptics, and even screen readers.

Until then, I'll probably use iOS for Braille, image descriptions, and an overall smooth experience, and use Android for its raw power and its honestly better TTS engine; well, if you have a Samsung, that is. With the ability to customize Braille commands, iOS has given me an almost computer-like experience when browsing the web. Android has some of that, but not the ability to customize it.

Conclusion

I hope you all have enjoyed this article, and learned something from my diving into the Braille side of both systems. If so, be sure to share it with your friends or coworkers, and remember that speech isn't the only output modality that blind, and especially Deaf-Blind, people use. As Apple says on their accessibility site, Braille is required for Deaf-Blind users. Thank you all for reading.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!


The Blind Android

This post is to reflect on what can be gained by using Android, as opposed to iOS. My previous post, My Dear Android, talked a little about this, but I wanted to go into further detail here.

USB-C has won

I have a lot of accessories, for computers and phones. I have a game controller, which uses Micro USB, but if you buy it now, it'll likely come with USB-C. I have a game controller for a phone, which uses USB-C. I have a Braille display, which uses USB-C. In fact, I'd say just about every modern Braille display uses USB-C. I have USB-C earbuds. All of these technologies use USB-C or can be made to use it with a dongle.

When I use an iPhone, any iPhone today, I have to put all these accessories through a dongle. I don't have a USB-C to Lightning dongle yet, but I do have a Lightning to USB-A one. So, whenever I want to plug in, say, a USB-C flash drive, I can't. I can't plug my USB-C earbuds into the iPhone. Now, are there dongles for this? Sure. But why deal with that? USB-C has won, soundly, over Lightning. Lightning was always going to be a closed, Apple-only system. No one likes non-standard junk.

Audio and standards

As mentioned in another article, I have a pair of Sony LinkBuds S. These are truly wireless earbuds that have noise canceling, transparency mode, integration with Google Assistant and Alexa, integration with Spotify and Endel, and they sound fantastic. When I used them with my iPhone, which has Bluetooth 5.0 (which the newest iPhone SE 2022 also has), the lag was just too much to deal with. When I use them with Android, the lag is noticeable, yes, but much less, and much easier to deal with. This really pushed me back to Android. With iPhone, I would have to get all Apple products to have the best experience. I would need to get the new AirPods or AirPods Pro. I would need an Apple Watch. I would need a Mac. With Android, interoperability means I can get any Android-supported accessories, and they would work just fine.

Another difference between the two ecosystems is that Google Assistant readily works with the LinkBuds S. Assistant reads incoming messages, reads notifications, and does just about everything one can do with the Pixel Buds Pro. On iPhone, there is no way to get Siri to automatically read new notifications unless you have a pair of AirPods. As this shows, Android works with many more accessory types, not just in a basic way, but supporting them to their fullest potential.

Also, did I mention the Bluetooth codecs? On Android, several phones have three or more different codecs, to support the widest range of audio gear. On iPhone, there's just SBC, the lowest quality codec that must be supported, and AAC, the codec Apple favors. No APTX, no LDAC, no LC3. So, even if you get an expensive pair of headphones that supports APTX low latency audio, you won't get that support on an iPhone. To be fair, some Android phones don't support APTX either, but on Android, you have that choice of phones. On iPhone, you don't.

Works with Windows and Chromebooks

If you use your computer a lot, you may want to text with it. If you're blind, chances are that you have a Windows PC. Well, iPhone works exclusively with Mac computers, so you can't text from your PC, or make calls from your PC, or control your phone from your PC. Oh yeah, you can't control your iPhone from your Mac either. With Android, you can do all of this from a Windows computer. If you use Google Messages, you can even read and send texts from the web, using your phone and phone number. As an added bonus, the Messages for Web page is very accessible, and has keyboard commands for navigating to the conversations list or messages list.

This gives me the freedom to do what I want, from whatever device I'm on. I don't have to switch contexts from my computer, to my phone, just to send a text, or read a text. I can just open Messages for Web, and do everything there.

Are you a Developer?

How much do you think you'd save if you didn't have to pay $100 per year? That's what it costs to keep an app on the Apple App Store. If you're a blind developer, you may be paying for JAWS every year too, so that's $200 a year, just to make great apps for iPhone. Along with all that, you have to deal with the sometimes frustrating experience of not only using a Mac, but developing on it, in Xcode. Now, you may be using a framework like React Native, or Beeware for Python, where you don't have to code in Swift, or touch Xcode all that much. If so, that probably cuts down on a lot of stress. But you still have to spend money just to keep your app on the App Store.

On Android, all a developer needs to do is pay $25, once. That's it. There is the 15% service fee on in-app purchases, but if your app is free, you don't have to worry about any of that. Also, you aren't limited to one language. You can use Java, Kotlin, some C++, C#, Python, JavaScript, Dart, and Corona (Lua). Of course, a few of these, like JavaScript (React Native and such) can be used to create iPhone apps too. But with Android, you can use your superior Windows platform, VS Code, and NVDA or JAWS to develop Android apps easily. Also, Android Studio is accessible on Windows too.

Accessibility, the mixed bag

Now we get into the thing I'm all about: accessibility. If you use apps like Telegram, DoorDash, Messenger, YouTube, and others, you may find that they don't work as well as they should on iPhones. YouTube just recently gained a bug where you can't go past the third or so item in the home tab. Android doesn't have that problem. DoorDash has reviews in the middle of their menus, and tells you the time the delivery will reach you, not the estimated time in minutes as it does on Android. In Telegram on the iPhone, if you have a message that covers more than the screen height, VoiceOver will not navigate to the next message until you scroll forward. On Android, TalkBack will eventually reach the next message, and will not get stuck.

This shows, to me at least, a slice of something strange. Android seems to have the more flexible accessibility framework, allowing code to tell more of the story than visuals. On iPhone, VoiceOver doesn't look past the current screen of content, or the cross-platform framework doesn't tell VoiceOver about it, even though it does tell Android and lets TalkBack navigate to it. However the code works, it results in a worse experience on iPhone, and a better one on Android. I can't argue with results.

Now, for image descriptions. I do miss them, being on Android. But, I'm sure Google is working on them, with its testers. After all, TalkBack can describe text, and icons now. And it does that very well. So, I'm sure they'll get image descriptions down in maybe a year. In the meantime, I still have an iPhone, Lookout, Bixby Vision, and Envision to hold me over until then.

I'm also hoping Google works on audible graphs, as that's pretty helpful. I could see them integrating that with image descriptions to describe graphical graphs, which iOS doesn't do yet.

Now, for Braille, things have improved. I grabbed a Focus 14 to work with, and find that I can use my phone with Braille support for about 30 minutes without growing tired of it. One really nice thing that TalkBack does is focus management. If you leave an app, then come back to it, focus will remain at the spot where you left it. So, if you're reading a Reddit thread in RedReader, and you go to Messenger to read and reply to a message, when you come back to RedReader, your focus will be on the exact comment you left it on. I don't recall that ever happening on the iPhone.

Mostly, it's a very good start for Android's new Braille implementation. One that, even though it's new, is very stable, and all commands work fine. There isn't the issue of HID-based displays on iOS, where you cannot assign the “enable autoscroll” command and such. Input works great, and there is never a time when the input process gets stuck and you have to press the “translate” command several times to plunge it out.

Conclusion

After spending a week with iPhone, I'm back on Android. Yes, I'm looking forward to greater accessibility, like image descriptions, Braille improvements, and audible graphs and charts, but I also love what Android is right now. Android is open, allows for greater innovation by developers, allows accessory manufacturers to create great, integrated experiences, and in quite a few cases, is even more accessible than the iPhone.

Android also allows one to use many of its services from a Windows computer, which is more popular in the blind community than Macs. This allows the user to stay in the same context, without needing to pull out a phone just to check a text. One can also make calls and control their Android phone from a PC.

In closing, thanks for reading this article on my journey with Android and iPhone. I know I'm not done with this, and as the two operating systems grow and age, things will change, on either Android or iOS' side. Feel free to subscribe to my blog, or leave comments.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!


My Dear Android

Last night, I turned on my Galaxy S20 FE (5G) again. To update apps, and compare a week away with the iPhone to how Android feels now. And, I must say, Android is still charming to me.

Telegram works better than on iOS. On iOS, VoiceOver gets stuck on a long message, not moving to the next message at all until you scroll forward. I've not tried Whatsapp yet, but I wouldn't be surprised if it worked better there too. Also, Doordash is a lot easier to use on Android, without all the reviews and junk getting in my way like on iOS.

But the big thing was my earbuds. I have a pair of Sony LinkBuds S, which sound great, work with either Google Assistant or Alexa directly (not through the framework where you hold down the home button and use that sort of voice control interface), and have all the good stuff like noise canceling and transparency mode, which can also be changed through Google Assistant.

So, I can use them with my iPhone XR. They work pretty well, and I can use Alexa through them. But the latency is awful, and it took a few tries before I could get them set up. On Android, though, the latency is mild enough that I can deal with it, and setup was quick and easy. This is a symptom of Apple's issue of wanting control. I don't have the AirPods. I don't have the AirPods Pro. I do have the Sony LinkBuds S, which probably blow all AirPods out of the water with their quite literally chest-thumping bass (at least for me and my hearing). The AirPods Pro, first generation, didn't have that. I have little hope that the second generation, or the regular AirPods third generation (that can get confusing really fast), would have that. Plus, there's one cord to rule them all.

That's right, USB-C. I love it! It's everywhere, used on just about everything, and I can connect my phone to my dock at work and use it with a keyboard and wired headphones. Speaking of wired headphones, there are actually USB-C headphones. There aren't many Lightning headphones. Yes, I can get Apple's wired headphones. But that's $30. What if I want a pair of $250 cans I can rock out to?

Lastly, TalkBack is on a good path. They've added basic Braille support, which they'll hopefully be improving throughout the coming year, Android 13 has audio description APIs, and hopefully the next update to TalkBack brings image descriptions, so I can see my cat that I had to give up recently. Poor little Ollie. On the iPhone, while image descriptions are bright and vibrant, Braille is starting to suffer a good many pesky bugs that make me not even want to use it. Maybe one or two will be fixed when iOS 16 is released, but they've got a week to do it, and I don't see them spending that much time on a minority of a minority. However, TalkBack's Braille support, while new, is pretty solid, a good base to build upon.

So, I wanted to post this to balance things out from my other post when I went from Android to Apple. My journey is definitely not over, and neither are the two operating systems in question. While we know what iOS 16 brings, new voices, door detection, and probably other stuff, the TalkBack team has been pretty tight-lipped on what they've been working on. I miss the days when we had more open dialog with them. But at least they have a blind person on the team that does interact with the community some.


You can always subscribe to my posts through Email or Mastodon. Have a great day, and thanks for reading!


Six Months with Android: A journey from a blind perspective

During the last six months, I’ve used Android exclusively as my primary device. I made calls with it, texted with it, read emails with it, browsed the web, Reddit, Twitter, and Facebook with it, and played games on it. Now, I’m going to express my thoughts on it, its advancements, and its issues.

This will contain mostly opinions, or entirely opinions, depending on whether you really love Android or not. But whatever your stance, these are my experiences with the operating system. My issues may not be your issues, and so on.

Daily Usage

To put things into perspective, I’ve used my phone, the Samsung Galaxy S20 FE 5G, for the following, with the apps I’ve used:

  • Email: Gmail, Aquamail, DeltaChat
  • Messaging: Google Messages
  • RSS: Feeder
  • Podcasts: Podcast Addict, PocketCasts
  • Terminal: Termux
  • HomeScreen: 1UI
  • Screen reader: Google TalkBack
  • Speech Engine: Samsung TTS and Speech Services by Google
  • Text Recognition/object detection: Google Lookout and Envision AI
  • Gaming: PPSSPP, Dolphin, AetherSX2
  • Reddit: Redreader

I’m sure I’m forgetting a few apps, but that’s basically what I used most often. For Facebook, Twitter, YouTube, and other popular services, I used their default apps, with no modifications. I used all the Google services that I could, and rarely used Samsung’s apps. So, this is to show that I was deep into the Android ecosystem, with Galaxy Buds Pro, a Chromebook, and a TicWatch E3.

The good

I want to start off the comparison with what worked well. First, the Samsung TTS voices are really nice, sometimes sounding even smoother than Alex on iOS, and much more so than the Siri voices. I still love the Lisa voice, which, to me, sounds as close to Alex as possible in her cadence and professional-sounding tone. Yes, the voices could be sluggish if fed lots of text at once, but I rarely ran into that.

I also love the wide variety of choice. Apple only includes the AAC Bluetooth codec on their iPhones, plus the mandatory SBC. So, if you get APTX headphones, or ones with Samsung’s variable codec or other codecs, it won’t matter: you’ll fall back to SBC, which sounds the worst of all of them. If your headphones have AAC, of course, it’ll get used on the iPhone. But if not, you’re stuck with SBC. Android phones, though, usually come with a few different codecs for headphones to choose from, and in the developer settings, you can choose the codec to use.

Another great feature of all modern Android phones is USB-C. Everything else uses USB-C now, including Braille displays, computers, watches, and even iPads. With Android, you can plug all these things into your phone with the same cable. If your flash drive has USB-C, you can even plug that in! With iPhone, though, you have to deal with Lightning, which is just another cable, and one you’ll likely have fewer of, since less stuff uses it.

The last thing is that Android phone makers typically try out new technology before Apple does, leading to bigger cameras, folding phones, or faster Wi-fi or cellular data. Now that the new iPhone SE has 5G, and probably the latest Wi-fi, though, that’s most likely less of an issue. Still, if you like folding phones, Android is your only choice right now.

Starting on the software, Android is pretty close to Linux, so if you plug in a game controller, keyboard, or other accessory, it’ll probably just work. If you have an app for playing music with a MIDI keyboard, and you plug one in, it’ll likely work. On iPhone, though, you need apps for more things, like headphones and such.

Another nice thing, beginning in the accessibility arena, is that the interface is simple. Buttons are on the screen at all times, not hidden behind menus or long-press options like they are a lot of the time on iOS. If you can feel around the screen with one finger, or swipe, you’ll find everything there is to find. This is really useful for beginners.

Another pleasant feature is the tutorial. The TalkBack tutorial guides Android users through just about every feature they’ll need, and then shows them where they can learn more. VoiceOver has nothing like that.

On Android, things are a lot more open. Thanks to that, we have the ability for multiple screen readers, or entirely new accessibility tools, to be made for Android. This allows BRLTTY to add, at least, USB support for HID Braille displays, and Commentary to OCR the screen and display it in a window. This is one of the things that really shines on Android.
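To show just how open this is, here’s a minimal sketch of a third-party accessibility service, the same kind of thing BRLTTY and Commentary register as. A real one would also declare itself in the app manifest and be switched on by the user.

#+beginsrc java
import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

// A minimal sketch of a third-party accessibility service. A real tool
// would speak, braille, or OCR these events; this one only logs them.
public class TinyScreenReader extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Receives the same event stream TalkBack does.
        Log.d("TinyScreenReader", String.valueOf(event.getContentDescription()));
    }

    @Override
    public void onInterrupt() {
        // Called when the system wants the service to stop its feedback.
    }
}
#+endsrc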

The bad

Those Bluetooth headphones I was talking about? The Galaxy Buds Pro are very unresponsive with TalkBack, making them almost useless for daily use. The TicWatch has its own health apps, so it doesn’t always sync with Google Fit, and doesn’t sync at all with Samsung Health. Otherwise, the watch is a nice one for Android users. On iPhone, though, it doesn’t share health data with Apple’s Health app at all, just with Google Fit, which doesn’t sync with the Health app either.

A few days ago, a few things happened that brought the entire Android experience into focus for me. I was using the new Braille support built into TalkBack, with an older Orbit Reader Braille display, since my NLS EReader doesn’t work with TalkBack, there being no Bluetooth Braille HID support. I found that reading using the Google Play Books app is really not a great experience in Braille. Then, I found a workaround, which I’ll talk about soon, but the fact remains that it’s not a great experience on Google Play Books.

So, I got confirmation that someone else can reproduce the issue. The issue is that the display is put back at the beginning of the page before even reading the next page button, and that one then cannot easily move to the next page. I then contacted Google Disability support. Since their email address was no longer their preferred means of contact, I used their form. On Twitter, they always just refer you to the form.

The form was messy. With every letter typed into the form, my screen reader, NVDA on Windows, repeated the character count, and other information. It’s like no blind person has ever tested the form that blind people are going to use to submit issues. “No matter,” I thought. I just turned my speech off and continued typing, in silence.

When the support person emailed me back, I was asked some general info, and to take a video of the issue. This would require, for me, a good bit of setup. I’d need three devices: the phone, the Braille display, and something to take the video with, probably a laptop. Then I’d need to get everything in the frame, show the bug, and hope the support person can read Braille enough to verify it.

This was a bit much for me. I have a hard job, and I have little energy afterwards. I can’t just pop open a camera app and go. So, I asked around, and found a workaround. If you use the Speech Central app, you can stop the speech, touch the middle of the screen, and then read continuously. But why?


This really brought home to me the issues of Android. It’s not a very well put together system. The Google Play Books team still uses the system speech engine, not TalkBack, to read books. The Kindle app does the same thing. There is barely a choice, since TalkBack reads the books so poorly. This is Google’s operating system, Google’s screen reader, and Google’s book-reading app. There is little excuse for them to not work well together.

Then, either that night or the night after, I got a message on Messenger. It was a photo message. So, naturally, I shared it with Lookout, by pressing Share, finding Lookout in the long list of apps, double tapping, waiting for the image recognition, and reading the results. And then I grabbed the iPhone, opened Messenger, opened the conversation, double-tapped the photo, put focus on the photo, and heard a good description. And I thought, “Why am I depriving myself of better accessibility?”

And there’s the big issue. On iOS, Braille works well, supports the NLS EReader, and even allows you to customize commands, on touch, Braille, and keyboard. Well, there are still bugs in the HID Braille implementation that I’ve reported, but at least the defaults work. That’s more than I can say for TalkBack and Android.

And then the big thing, image descriptions, and by extension, screen recognition. TalkBack finally has text detection, and icon detection. That’s pretty nice. But why has it taken this long? Why has it taken this long to add Braille support? Why do we still have robotic Google TTS voices when we use TalkBack? After all these years, with Google’s AI and knowledge, Android should be high above iOS on that front. And maybe, one day, it will be. But right now, Android’s accessibility team is reacting to what Apple has done. Braille, image descriptions, all that. And if there’s a central point to what I’ve learned, it’s this: do not buy a product based on what it could be, but what it currently is.

Then, I started using the iPhone more, noticing the really enjoyable, little things. The different vibration effects for different VoiceOver actions. Not just one for the “gesture begin”, “gesture end,” “interactable object reached”, and “text object reached”. No, there are haptics for alerts, reaching the boundary of the screen or text field, moving in a text field, using Face ID, and even turning the rotor. And you can turn each of these on or off. What’s that about Android being so customizable?

Then there’s the onscreen Braille keyboard. On Android, to calibrate the dots, you hold down all six fingers, hold it a little longer, just a bit longer… Ah, good, it detected it this time. Okay, now hold down for two more seconds. Now you’re ready to type! Yes, it takes just about that long.

On iOS, you quickly tap the three fingers of your left hand, then the fingers of your right hand, and you’re ready! Fast, isn’t it? These kinds of things were jarring with their simplicity, coming from Android, where I wasn’t even sure if calibration would work this time. I do miss the way typing on the Android Braille keyboard would vibrate the phone, letting you know that you’d actually entered that character. However, the iPhone’s version is good enough that I usually don’t have to worry about that.

I want to talk a bit more about image descriptions. While I was on Android, I learned to ignore images. Sure, I wanted to know what they were, but I couldn’t easily get that info, not in like a few seconds, so I left them alone. On iOS, it’s like a window was opened to me. Sure, it’s not as clear as actually having sight, and yes, it gets things wrong occasionally. But it’s there, and it works enough that I love using it. Now, I go on Reddit just to find more pictures!

And for the last thing: audio charts. Google has nothing like this. TalkBack tries to narrate charts, but that’s nothing like hearing the chart and realizing things about it yourself. Hearing the chart is also much faster than listening to your phone read out numbers and labels.
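
To make the idea concrete, here’s a minimal sonification sketch in Python. It’s my own illustration of the concept, not anything Apple ships: each data point becomes a tone whose pitch tracks its value, so a rising curve sounds like a rising sweep.

```python
import math
import struct
import wave

RATE = 44100  # samples per second

def tone(freq_hz, dur_s=0.15, volume=0.4):
    """Generate one sine tone as 16-bit little-endian PCM frames."""
    n = int(RATE * dur_s)
    return b"".join(
        struct.pack("<h", int(volume * 32767 *
                              math.sin(2 * math.pi * freq_hz * i / RATE)))
        for i in range(n)
    )

def sonify(values, low_hz=220.0, high_hz=880.0, out="chart.wav"):
    """Write a WAV file in which each point's pitch tracks its value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid dividing by zero on flat data
    with wave.open(out, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(RATE)
        for v in values:
            pitch = low_hz + (v - lo) / span * (high_hz - low_hz)
            w.writeframes(tone(pitch))

sonify([3, 5, 9, 14, 11, 6, 2])  # play chart.wav to "hear" the curve
```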

The ugly

Here, I’ll detail some ugly accessibility issues on Android that really make iOS look as smooth as glass in comparison. Some people may not deal with these, but I did. Maybe, by the time you read this article, they’ll be fixed in Android 13, or a TalkBack update or something.

First, text objects can’t be too long, or TalkBack struggles to move on to the next one. This shows up most in the Feeder app, which, for accessibility reasons, uses the Android native text view for articles. This is nice, unless a section of an article spans a whole screen of text. Take the Rhythm of War rereads on Tor: some of those sections are pretty long, and each is all one text element. So TalkBack keeps speaking that element as you swipe toward the next one, and it can take one swipe, or three, or five, before focus finally moves on. This happens a lot in Telegram too, where messages can be quite long.

Another issue is clutter. A lot of the time, every button a user might need is on the screen. For example, the YouTube app has “Go to channel” and “actions” buttons beside every video, which means you have to swipe three times per video. On iOS, each video is a single item, and the actions live in the actions rotor. TalkBack has an actions menu too, but apps rarely use it; Gmail does, for example, but YouTube doesn’t. This makes things even trickier for beginners, who then have to remember which apps use it, which don’t, and how to get to it.

When an Android phone wakes up, it reads the lock status, which is usually something like “Swipe with two fingers or use your fingerprint to unlock.” Then, it may read the time. That’s a lot of words just to check what time it is. An iPhone, dependably, reads the time, then the number of notifications. Apple’s approach is a breath of fresh air, laced with the scent of common sense and testing by actual blind people. This may seem like a small thing, but those seconds spent listening to something you’ve heard a hundred times before add up.

If you buy a Pixel, you get Google TTS as your speech engine. It sucks pretty badly. Google is improving it, but TalkBack can’t use the improvements yet, even though other screen readers and TTS apps can. Crazy, right? Still, with the Pixel, you get Google’s Android, software updates right at launch, the new Tensor processor, voice typing, and so on. If you get a Samsung, you get a good TTS engine, for English and Korean at least, and a pretty good set of add-ons to Android, but also a screen reader that’s six months old and an OS whose updates run about six months behind as well. This is bad mostly because of TalkBack. You see, there are two main versions of TalkBack: Google’s version and Samsung’s version. Samsung’s TalkBack is practically the same as Google’s, but at least one major version behind, all the time.

With an iPhone, you get voices aplenty: Eloquence (starting next month), Alex, Siri voices, and Vocalizer voices, with rumors that third-party TTS engines may come to the store soon. You get a phone that, even with models as old as the 8, gets the latest version of iOS on the day it’s released. And there is no older version of VoiceOver just floating around out there.

Further thoughts

I still love Android. I love what it stands for in mobile computing: a nice, open source, flexible operating system that can be customized by phone makers, carriers, and users to be whatever is needed. But there really isn’t that kind of drive for accessibility. TalkBack languished for years and years, and is only just now hurrying to catch up to VoiceOver. Will it succeed? Maybe. But VoiceOver isn’t going to sit still either. Apple now has the voice that many in the blind community can’t do without. On Android, that voice, Eloquence, is abandoned, and can’t be bought by new users. And when Android goes 64-bit only, who knows whether Eloquence will keep working. iOS, on the other hand, officially supports Eloquence, the Vocalizer voices, and even the novelty voices from the Mac. They won’t be abandoned just because a company can’t find it within themselves to maintain such a niche product. Furthermore, all these voices are free. Of course, when a blind person buys an $800 phone, they’d better be free.

I’m also not saying iOS is perfect. There are bugs in VoiceOver, iOS, and the accessibility stack; Braille in particular suffers from some of them. But nothing is bug-free. And no accessibility department will ever be big, well-staffed, well-funded, or well-appreciated. That’s how it is everywhere. The CEO or president or whoever is up top will thank you for such a great job, but when you need more staff, better tools, or just some appreciation, you’ll often be gently, but firmly, declined. The smaller the company, the less that may happen, but the disability community can never yell louder than everyone else. Suddenly, the money, the billions or trillions of dollars, just isn’t there anymore when people with disabilities kindly ask.

But the difference I see is in what the two accessibility teams focus on. Apple focuses on an overall experience. When they added haptics to VoiceOver, they didn’t just make a few for extra feedback; they added plenty, for a feedback experience that can even be used in place of VoiceOver’s sounds. And they used the full force of the iPhone’s impressive haptic motor. Just feel that thump when you reach the boundary of the screen, or the playful tip-tick of an alert popping up, or the short bumps as you feel around an empty screen. All of that gives the iPhone more life, more expression, in a world of boring speech and even more boring plain Braille.

The iPhone has also been tested in the field, even for professional work like writing a book. One person wrote an entire book on his iPhone, about how a blind person can use the iPhone. That is what I look for in a product: that level of possibility and productivity. As far as I know, no blind person has ever written a book using Android, preferring the “much more powerful” computer. And yet the chips powering today’s phones are sometimes more powerful than laptop chips, especially older ones. No, it’s the interface, and how TalkBack presents it, that gets in the way of productivity.

Lastly, I’m not saying that a blind person cannot use Android. There are hundreds of blind people who use Android and love it. But if you rely on Braille, or love image descriptions, or the nice, integrated system of iOS, you may find Android less productive. If you don’t rely on these things, and don’t use your phone for too much, then Android may be a cheaper and easier option for you. I encourage everyone to try both operating systems, on well-supported phones, for themselves. I’ll probably keep my Android phone, since I never know when a student will come in with one. But I most likely won’t be using it much. After all, iOS and VoiceOver offer so much more.

Looking for a Digital Home

This is going to be a more emotional post, which mirrors my mental state as of now. I just have to write this down somewhere, and my blog should be a good place to put it. It may also be helpful for others who struggle with this.

I've used just about every operating system out there, from Windows to Mac to Linux, ChromeOS, Android, and iOS. I've still not found one I can be completely happy with. I know I may never find an OS that fits me perfectly, but so many others have found Linux to be all they ever need. I wish I could find that: the feeling of not needing to switch to another laptop just to get a good terminal with a screen reader that will always read output, or to use Linux GUI apps like GPodder, or Emacs with Emacspeak.

There are times when Windows is great: reading on the web, games, and programs made by blind people to help with Twitter, Telegram, and Braille embossing, plus countless screen reader scripts. Other times, I want a power-user setup. I want GPodder for podcasts, or to explore Linux command-line apps. I asked the Microsoft accessibility folks about Linux GUI accessibility, and they just said to use Orca. I've never gotten Orca to run reliably on WSL2, though it's always been reliable on ChromeOS with Crostini.

Whenever I get enough money, I'll get 16 GB of RAM, so maybe I can run a Linux VM. But still, that's not bare metal. And if I switched to Linux, I would have to run a Windows VM for the few things that run better on Windows, like some games, and probably the Telegram and Twitter support. It's all just kind of hard to have both. Dual booting may work, but I've also heard that Windows gets greedy and messes with the bootloader.

But with a blind person now working on Linux accessibility at Red Hat, I hope that, soon, I won't need Windows anymore. I can hope, at least. Still, since a few in the hardcore Linux community have the mindset that I must fix everything myself, I have to stay cautious and unexcited about this development, lest the little joy that a full-time Linux accessibility hire gives me be taken away by their inflexibility and cold, overly logical mindset.

But I'm not done yet. With the little energy taking vitamins has given me, I've made a community for FOSS accessibility people on Matrix, bridged to IRC. I continue to study books on Linux, although I've not gotten up the energy to keep learning to program and practice. Maybe I'll try that today.

Mostly, I don't want newcomers to Linux to feel as alone in their wrestling with all this as I do. All other blind people are already so far ahead. Running Arch Linux, able to code, or at least happy with what they have and use. I don't want future technically inclined blind people to feel so alone. Kids who are just learning to code, who are just getting into GitHub, who are just now learning about open source. And they're like “so what about a whole open source operating system?”

And then they look, find Linux, and find so few resources for it, for them. Nothing they can identify with. Well shoot, there it is: documentation, I guess. I do want to wait until Linux, and GNOME or whatever we ultimately land on, is better. Marco (in MATE) shouldn't be confused whenever a Qt or Electron-based app closes and focus is left out in space somewhere. An update shouldn't break Electron apps' ability to show a web view to Orca. And we definitely shouldn't be teaching kids a programming language, Quorum, made pretty much specifically for blind people. But I'm glad we're progressing. Slowly, yes, but it's happening at least.

By the Blind, For the Blind

Why tools made by the blind are the best for the blind.

Introduction

For the past few hundred years, blind people have been creating amazing technology and ways of dealing with the primarily sighted world. From Braille to screen readers to canes and training guide dogs, we've often shown that if we work together as a community, as a culture, we can create things that work better than what sighted people alone give to us.

In this post, I aim to celebrate what we've made, primarily through a free and open source filter. This is because, firstly, that part of what we've made is almost always overlooked and undervalued, even by us. And secondly, it fits with what I'll talk about at the end of the article.

Braille is Vital

In the 1800s, Louis Braille created a system of writing made up of cells of six dots, configured in two columns of three, whose patterns form letters. It followed the languages of print, but in a different written form. This system, called Braille after its inventor, became the reading and writing system of the blind. Most countries, even today, use the same configurations Louis created, with some new symbols added for each language's needs. Even Japanese Braille uses something resembling that system.
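
As a small illustration of how regular that six-dot cell is: Unicode encodes every Braille pattern starting at U+2800, with one bit per dot, so you can build any cell with a few lines of Python. The dot-to-bit mapping below is the standard Unicode one; the example letter is just my own pick.

```python
# Unicode Braille patterns start at U+2800; dots 1-6 map to bits 0x01-0x20.
DOT_BITS = {1: 0x01, 2: 0x02, 3: 0x04, 4: 0x08, 5: 0x10, 6: 0x20}

def cell(*dots):
    """Return the Unicode Braille character with the given dots raised."""
    code = 0x2800
    for dot in dots:
        code |= DOT_BITS[dot]
    return chr(code)

# In English Braille, "l" is dots 1, 2, and 3: the full left column.
print(cell(1, 2, 3))  # ⠇
```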

Now, Braille displays are becoming something that the 20 or 30 percent of blind people who are employed can afford, and something that the US government is creating a program to give to those who cannot afford one. Thus, digital Braille is becoming something that all screen reader creators, yes, even Microsoft, Apple, and Google, should be working with heavily. Yet Microsoft doesn't even support the new HID Braille standard, and neither does Google. Apple supports much of it, but not all. As an aside, I've not even been able to find the standards document itself, only a technical notes document from the NVDA developers.

However, there is a group of people who have taken Braille seriously since 1995: the developers of BRLTTY, a project with a long, documented history. This program basically makes Braille a first-class citizen in the Linux console. It can also be controlled by other programs, like Orca, the Linux graphical desktop screen reader.

BRLTTY has passed through the hands of a few amazing blind hackers (as in incredibly competent programmers) to land at https://brltty.app, where you can download it not only for Linux, its original home, but for Windows, and even Android. BRLTTY not only supports the Braille HID standard, but is also the only screen reader that supports the Canute 360, a multi-line Braille display.
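
That "controlled by other programs" part happens through BRLTTY's BrlAPI service, which ships with Python bindings. Here's a minimal sketch, assuming the brltty daemon is running and the brlapi bindings (python3-brlapi on many distros) are installed:

```python
import brlapi

b = brlapi.Connection()           # connect to the local brltty daemon
b.enterTtyMode()                  # claim the display for this program
b.writeText("Hello from BrlAPI")  # put text on the Braille display
b.readKey()                       # block until a display key is pressed
b.leaveTtyMode()                  # hand the display back to brltty
b.closeConnection()
```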

BRLTTY, and its spin-off project of Braille translation tables, LibLouis, have proven so reliable and effective that they've been adopted by proprietary screen readers like JAWS, Narrator, and VoiceOver: VoiceOver and JAWS use LibLouis, while Narrator uses both. This proves that the open source tools blind people create are undeniably good.
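
LibLouis has Python bindings of its own, so you can run the very same translation engine those screen readers embed. A sketch, assuming liblouis and its Python bindings are installed; "en-ueb-g2.ctb" is the contracted English (UEB grade 2) table that ships with the library:

```python
import louis

# Translate print text into contracted (grade 2) Braille.
table = ["en-ueb-g2.ctb"]
braille = louis.translateString(table, "knowledge is power")
print(braille)  # contracted Braille; "knowledge" collapses to a short form
```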

But what about printing to Braille embossers? That is important too: digital Braille may fail to work for whatever reason, and we should never forget hardcopy Braille. Oh hey, lookie! Here's a driver for the Index line of Braille embossers. The CUPS (Common Unix Printing System) program has support for embossers through the cups-filters package! This means that Linux, that impenetrable, unknowable system for geeks and computer experts, contains, even out of the box on some systems, support for printing directly to a Braille embosser. To be clear, not even Windows, macOS, or iOS has this. Yes, Apple created CUPS, but they've not added the drivers for Braille embossers.

Let that sink in for a moment. All you have to do is set up your embosser, set the Braille code you want to emboss with and the paper size, and you're good. If you have a network embosser, just put in the IP address, like you'd do in Windows. Once that's sunk in, I have another surprise for you.

You ready? You sure? Okay then. With CUPS, you can emboss graphics on your embosser! Granted, I only have an Index D V5 to test with, but I was able to print an image of a cat and at least recognize its cute little feet. I looked hard for a way to do this on Windows and only found an expensive tactile graphics program. With CUPS, which hooks into other Linux programs like ImageMagick, you can get embossed images for free. You don't even have to buy extra hardware, like embossers made especially for graphics!
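
And since CUPS treats an embosser as just another printer, you can even drive it from a script. A sketch using the pycups bindings; the queue name "Index_D_V5" is hypothetical and would be whatever you named the embosser when you set it up:

```python
import cups

conn = cups.Connection()
print(list(conn.getPrinters()))  # embossers show up like any other queue

# Emboss a ready-made Braille file; with the cups-filters Braille support,
# plain text and even images can be sent to the same queue.
conn.printFile("Index_D_V5", "letter.brf", "Braille letter", {})
```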


Through both of these examples, we see that Braille is vital. Braille isn't an afterthought. Braille isn't a mere echo of what a screen reader speaks aloud. Braille isn't a drab, text-only deluge of whatever a sighted person thinks is not enough, or too much, verbiage. Braille is a finely crafted, versatile, and customizable system that the blind create so that other blind people can be productive and happy with their tools, and thus lessen the already immense burden of living without sight in a sighted world. And if electronic Braille fails, or if one just wants printed material like everyone else has, hardcopy is available and ready for use, for both text and pictures.

Speech matters too

If a blind person isn't a fast Braille reader, was never taught Braille, or just prefers speech, then that option should not just be available for them, but be as reliable, enjoyable, and productive an experience as possible. After all, wouldn't a sighted person get the best experience possible? Free and open source tools may not sound the best, but work is being done to make screen readers as good as possible.

In the Linux console, there are three options: Speakup, Fenrir, and TDSR. On the desktop, the screen reader has long been Orca, but another, called Odilia, is being written by two blind people in the Rust programming language.
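
Several of these, Orca and Fenrir among them, talk through Speech Dispatcher, the common speech service on Linux, which has Python bindings of its own. A minimal sketch, assuming the speech-dispatcher daemon and the python3-speechd bindings are installed:

```python
import speechd

client = speechd.SSIPClient("demo")  # register this client with the daemon
client.set_rate(25)                  # -100 to 100; 0 is the default rate
client.speak("Speech matters too.")
client.close()
```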

If one uses the Emacs text editor, one can also take advantage of Emacspeak. It takes its information not from accessibility interfaces, but from Emacs itself, so it can provide things like aural syntax highlighting, or conveying bold and italics through changes in speech.

Community

There are several communities for blind Linux and open source users: Blinux, the Orca mailing list, the LibreOffice accessibility mailing list, and the Debian accessibility mailing list.

Recently, however, a new way has appeared for all these groups, and sighted developers, to join together with, hopefully, more blind people, people with other disabilities, and other supporters: the Fossability group. For now, it's a Git repository, a mailing list, and a Matrix space. It's where we can all make free and open source software, like Linux, LibreOffice, Orca, Odilia, desktop environments, and countless other projects, as useful and accessible as possible.

Blind people should own the technology they use. We should not have to grovel at the feet of sighted people, who have little to no idea what it's like to be blind, for the changes, fixes, and support we need. We should not have to wait months for big corporations (corpses) to gather their few accessibility programmers to add HID Braille support to a screen reader. We should not have to wait years for our file manager to be as responsive as the rest of the system. We should not have to wait a decade for our screen reader to get a basic tutorial, so that new users can learn how to use it. We should not have to beg for our text editor to not just support accessibility, but to support choices in how we want information conveyed. This kind of radical community support requires that blind people be able to contribute all the way up the stack, from the kernel to the screen reader. And with Linux, that is entirely possible.

Now, I'm not saying that sighted people cannot be helpful; it's the exact opposite. Sighted people designed the GUI we all use today. Sighted people practically designed all forms of computing. Sighted developers can help because they know the graphical toolkits, and so can help us fix accessibility issues in them. And I'm not trying to demean the ongoing, hard, thankless job of maintaining the Orca screen reader; again, that's not even the job its maintainer gets paid for. However, I do think that if more blind people start using and contributing to Linux and other FOSS projects, even just with support or bug reports, a lot of work will be lifted from sighted people's shoulders.

So, let's own our technologies. Let's take back our digital sovereignty! We should be building our own futures, not leaving them to huge companies with overworked, underpaid, underappreciated, burnt-out, and understaffed accessibility engineers. Because while they work on proprietary, closed-off, non-standard solutions, we can build on the shoulders of the giants that have gone before us, like BRLTTY, the CUPS embosser drivers, and so many other projects by the blind, for the blind. And with that, we can make the future of assistive technology open, inviting, welcoming, and free!
