I’m sure all this has been said many times before, but all I’ve been able to find so far is this article on Alt-Text, which isn’t what I’m going for here. As a side note, I get that most members of society can only improve accessibility by adding Alt-text to the visual media they create, but isn’t it time we move a bit past just image descriptions? Guides for Alt-text are everywhere. I am not about to make yet another. This will, however, be a longer article than usual, and I wouldn’t blame you for coming up for air during this one.
In this article, I’ll talk about accessibility for blind people, not just with regard to the screen reader, which also gets talked about, but also the operating system, and the activity the blind person is trying to do. In writing this article, I hope to zoom out a bit and show developers, and others interested, that accessibility encompasses everything about the device, operating system, accessibility stack, screen reader, app, and person using the device. No part of that stack stands alone.
How to introduce a screen reader
I know this is going to be an edge case. But honestly, people in the accessibility space should be used to edge cases by now, and this happens much more than you’d think. We have our blind person here: a person who, up to now, has been fully sighted, but lost her sight instantly due to an accident. No, I’m not going into details, because there are tons of ways a person can lose their vision. No, this isn’t a real person, and I’m not, in any way, saying that this is how instant sight loss occurs or affects people, but I know that if I lost my sight this way, this is probably what would happen. And I know I’m not the only one who struggles with depression and anxiety.
So this person, let’s name her Anna, is trying to get used to having an entire sense just gone. Maybe she’s in pain, or maybe her mind is numb in trying to process everything. Or maybe she’s about to give up and find a sharp knife with which to end it all, searching and searching in the now permanent dark. But she remembers learning about Helen Keller in school. Thinking back, though, she doesn’t remember covering any more than her childhood. Anna wants to find a book on the rest of Keller’s life, so she stores that in the back of her mind, with a note to figure out how to get an audiobook.
Now, you may be considering this amount of personal story to be a bit much, or over the top, or inappropriate for an informational post. But the usual, detached way of writing makes blind people into nothing more than test subjects to be observed, and understood on a sort of acquaintance basis. I, for one, am tired of that. I, like every other blind person, am a living, thinking, feeling person. Notice that I didn’t say “blind user” above? No. A user is some case number or hypothetical list of assumptions to be grabbed, accounted for, and discarded at the end. But a person, well, they’re alive. They deal with their everyday trials and triumphs. They breathe and speak and change and grow and learn. A person is a bit less easy to discard, or to mock as “just another ‘luser.’” And yes, I’ve had to learn this too, over my years of teaching blind people.
Now, Anna got some help with her mental health, and she’s trying to learn to deal with being blind. She’s learned to feel for the light switch to tell whether a light is on or not, since she can’t see it. She’s learned that she needs to keep everything in order, so that she can find things much more easily. She’s even started making her own sandwiches, ham and cheese and some mayo, and pouring some cereal, even though she makes a bit of a mess with the milk. She now keeps a towel on the counter, for that reason.
But today, she’s gonna do something that she thinks would be impossible. She feels around on her nightstand for the slab of glass. Her iPhone. She hasn’t touched it since she went blind a week ago. Or has it been a few weeks? A month? She takes a deep breath, and lets it out. She breathes for a moment, then starts to examine the device. A flat screen with two horizontal lines for the earpiece. Buttons on both sides. A flat piece of glass with the camera on the back. You get the idea.
She then decides to try using Siri. Siri wasn’t especially useful to her before, but it’s all she has now. She tries some simple requests, like asking for the time, or the weather. She then decides to be a little more brave. Taking a deep breath, she asks “How can a blind person use an iPhone?” Siri responds, “Here’s what I found on the web. Check it out!”
Anna waits a moment, remembering what this screen looks like. She waits, thinking that maybe Siri would be so kind as to read it out to her. Siri, it turns out, is not so kind. Her room is silent. Tears do not make a sound as they fall.
She sits there for a moment, trying to pull herself together. Trying to gather up the pieces that had just gone flying everywhere. She breathes in, and breathes out. She grips the phone in her hands, and tries the last thing she can think of. Because if this doesn’t work, then this expensive phone is nothing but an expense to her. She tells Siri to call her mother. “Okay, calling Mom.” Her heart stops to listen. Her mind strains to hear. And the phone finally rings. And, after a moment, she hears her mother’s voice.
Now, this is without any kind of outside agency helping her, and I shudder to think of how many blind people go through this. They use Siri to text and call, read their texts and notifications, create a note and read it out, and set alarms. But apps, well, YouTube and books and anything besides texting, those are lost to them. Yes, having the ability to make a call or send a text is really important. And in this case, it’s probably what made the difference between barely surviving and possibly thriving. Because I’ve known people who come in with their iPhone and only know Siri. And maybe they’ve tried to use VoiceOver. But here’s where that falls apart.
Anna waited for the doctor to hang up the phone. She hated this part. The doctor hung up, but the automated system on which the call happened, well, it takes a minute. Finally, she heard the static and click. She’d been noticing more of that kind of thing. How the tone of the milk changed as she filled a cup, or the way her footfalls echoed when she was near a door. She’d kind of started making sounds, clapping her hands or stepping a bit louder, just to experiment with the sounds. Not too much, though. She never knew if someone was looking in through the window at that weird blind girl clapping at nothing or stomping her feet.
Anna smiles at the memories, but reminds herself that she’s got a thing to try. Through some miracle, her mom had found an app on the iPhone, called VoiceOver. It was supposed to read out what was on the screen, and guide you through tasks on it. So, she told Siri to “Turn on VoiceOver.”
Immediately, VoiceOver turned on. She heard a voice say “VoiceOver on.” And then it read the first app on the Home Screen. She was impressed, but wasn’t sure what to do next. The voice then said “double tap to open.” She tapped the screen twice. It read another app on the screen… Twice.
She continued playing with it, but couldn’t make much sense of it. She could feel around the screen, but then what? When she tapped on an item to open it, it just read that item. She tapped twice like it said, but it just read the item again. In frustration, after hearing “double tap to open” for the hundredth time, she quickly jabbed at the phone. Miraculously, the app opened. She had been touching the weather app. She felt around the screen, hearing about, well, the weather. Exasperated, she put the phone down and took a nap. She deserved it.
And here we have the meat of the issue. Yes, I tried that question to Siri, about how a blind person would use an iPhone, and she gave that “here’s what I found on the web” answer. This is even on iOS 17 beta, where Siri is supposed to be able to read webpages. But Siri can’t even ask the person if Siri should read the page or not. Because that’s just too hard.
Siri could have easily been programmed to tell the person about VoiceOver, and at least give tips on how to use it. But no. If there’s a screen, people inherently can view the screen, and a digital talking assistant shouldn’t get in the sighted person’s way. Right?
So, let’s look at accessibility here, in the context of a voice assistant. The voice assistant should never, ever assume that a person can see the screen. Like, ever. If it’s pulled up a long article, the assistant should offer to read the article. It doesn’t matter if a screen reader is on or not. As you read above, Anna didn’t even know the concept of a screen reader, let alone that one is on the iPhone.
Mobile phones have amazing accessibility potential specifically because of the voice assistant. This voice assistant is in popular culture already, so a newly blind person can at least use Siri, or Google Assistant, or even Bixby.
So, an assistant should be able to help a blind person get started with their phone. The digital assistant can lead a blind person to a screen reader, or even read its documentation like a book. If Amazon’s Alexa can read books, then Apple’s Siri can handle reading some VoiceOver documentation.
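To make that concrete, here is a minimal sketch of what an assistant-style result screen could do at the app level: always offer spoken output instead of assuming the person can see the screen. This is not how Siri actually works; `WebResult`, `ResultViewController`, and the alert wording are made up for illustration, and the speech simply uses Apple’s AVSpeechSynthesizer.

```swift
import UIKit
import AVFoundation

// Hypothetical sketch: an assistant-style result screen that always offers
// spoken output instead of assuming the person can see the screen.
// WebResult and ResultViewController are made-up names for illustration.
struct WebResult {
    let title: String
    let body: String
}

final class ResultViewController: UIViewController {
    private let result: WebResult
    private let synthesizer = AVSpeechSynthesizer()

    init(result: WebResult) {
        self.result = result
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not used in this sketch") }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Offer to read the result aloud whether or not a screen reader is on.
        offerSpokenOutput()
    }

    private func offerSpokenOutput() {
        let alert = UIAlertController(
            title: result.title,
            message: "Would you like me to read this out loud?",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Read it", style: .default) { _ in
            // Speak the body of the result with the system speech synthesizer.
            self.synthesizer.speak(AVSpeechUtterance(string: self.result.body))
        })
        alert.addAction(UIAlertAction(title: "No thanks", style: .cancel))
        present(alert, animated: true)
    }
}
```

The point isn’t the exact UI; it’s that spoken output is offered every time, not gated on whether the system thinks a screen reader is running.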
Moving forward, Anna had no idea how to use VoiceOver. Performing a double tap is incredibly difficult if you don’t have another person there who already knows, and can show you. You may think that it’s just tapping twice on the screen, but you have to do it very quickly compared to a normal tap. A little too slow, and VoiceOver just registers two separate single taps. And even once that’s mastered, the person has to learn how to swipe around apps, type on the keyboard, answer and end calls, and where the VoiceOver settings are to turn off the hints that are now getting in her way and making her lose focus on what she’s trying to do.
It’s been more than a decade since VoiceOver was shipped. It’s seriously time for a tutorial. And yes, while I know that there are training centers for the blind across the US, I’m pretty sure there are plenty of blind people that never even hear about these centers, or are too embarrassed to go there, or can’t go due to health reasons.
The screen reader, VoiceOver, in this case, was even less helpful than the voice assistant. Imagine that. A voice assistant helped Anna make a phone call. VoiceOver, due to the lack of any kind of usage information, just got in her way. That’s really sad, and it’s something I see a lot. People just turn back to Siri because it’s easier, and it works.
The idea that blind people just “figure it out” is nothing but the shirking of responsibility. Training centers should not have to cover the basics of a screen reader. But we do, and it takes anywhere from two weeks to a month or more of practice, keeping the student from slipping back into the sweet tones of Siri that can barely help them, and showing them all of what VoiceOver can do, which Apple just can’t manage to tell the person. Because it’s so much easier to show off a new feature than to build up the foundations just a tad bit more.
Context matters. Yes, an advanced user won’t need to be told all of VoiceOver’s features and how to do things on the phone. But someone who has just gone blind, or someone picking up their first iPhone, they need this. I don’t care if it’s Siri introducing the screen reader after the person asks how a blind person can use a phone, or VoiceOver asking the user if they’d like to use it after a good minute or two of sitting on the home or lock screen with no activity, or a tutorial popping up the first time a blind person uses VoiceOver. But something has got to change, because there are newly blind people every year. It isn’t like we people who were born blind are dying off and no blind people are replacing us, so that VoiceOver looks like this temporary measure until 50 years from now, when there are no more blind people left and VoiceOver can be left to rot and die.
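Apple would have to build that kind of introduction into the OS itself, but an individual app can do its own small version of the same thing. Here is a sketch, assuming a hypothetical `hasSeenVoiceOverTour` flag and a tour the app would provide:

```swift
import UIKit

// A sketch of an app-level version of that idea: if VoiceOver is running and
// this is the first launch, offer a short guided tour instead of assuming the
// person already knows their way around. The "hasSeenVoiceOverTour" key and
// the startTour closure are hypothetical.
func offerVoiceOverTourIfNeeded(from presenter: UIViewController,
                                startTour: @escaping () -> Void) {
    let defaults = UserDefaults.standard
    guard UIAccessibility.isVoiceOverRunning,
          !defaults.bool(forKey: "hasSeenVoiceOverTour") else { return }

    let alert = UIAlertController(
        title: "New here?",
        message: "It looks like VoiceOver is on. Would you like a quick tour of this app's screens and gestures?",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Start tour", style: .default) { _ in
        defaults.set(true, forKey: "hasSeenVoiceOverTour")
        startTour()
    })
    alert.addAction(UIAlertAction(title: "Maybe later", style: .cancel))
    presenter.present(alert, animated: true)
}
```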
Disabilities, whether we like it or not as a society, or even as a disability community, are going to be with us for a long time to come. You see how much money is poured into a big company’s accessibility team? Very little. That’s about how much money is also poured into medical fixes for disability. If it weren’t so, we’d all be seeing, and hearing, and walking, and being neurotypical, and so on, right now. They’ve had a good 40 years of modern medicine. They are not going to fix this in the next 40 years, either. Even if they do fix it for every disability, some people don’t want to be typical. They find pride in their disability. That’s who they are.
How a screen reader and operating system work together
For the sake of my readers, the ones left at this point, and myself, I’m going to continue to use Anna for this section as well. Let’s say she has figured out the iPhone thanks to her mom reading articles to her, and now her time to upgrade her phone has arrived. But she’s not found a job yet, and an iPhone is way out of her current budget. Oh, did I tell you she was “gracefully let go” from her old job at Walmart? Yeah, there’s that too. Her bosses just knew that she couldn’t possibly read the labels on the goods she normally organized and put on the shelves, because they don’t have Braille on them.
Now, Anna is looking for a new phone. Her iPhone X is getting slow, and she’d finally paid off the old one with the help of her new SSDI checks, and her mom where needed. She doesn’t like to ask people to do things for her unless absolutely necessary.
So, she goes to a phone store and picks out the best Android phone she can find. It’s not a flagship phone by any means. It’s not even a flagship killer. It’s just a Samsung phone. She pays for it, down a good $250, and walks out with the phone.
Knowing how things generally are now, she holds down the power button, feels a nice little buzz (though not as nice as the one her iPhone could make), and asks her mom to read her some articles on how to use a Samsung with… whatever VoiceOver it has.
So to make a long story short, she turns on TalkBack, goes through the tutorial built right into the screen reader, and is instantly in love. She can operate the screen reader, without her mom even needing to read her the commands and how to do them. She hits the finish button on the tutorial, and is ready to go.
She first wants to make sure she’s running the latest version of everything. So, she opens the Galaxy Store, and feels around the screen. She finds a link of some kind, then a “don’t show again,” and a “close” button. Puzzled, she activates the close button. Now, she’s put into the actual store. She feels around the screen, finding tabs at the bottom. She’s not really sure what tabs are, but she’s seen them on some iPhone apps. She finds a “menu” tab, and activates that. Feeling around this new screen, she finds the updates item, and double taps on it. Nothing happens. She tries again, thinking she didn’t double tap hard enough, or fast enough, or slow enough. No matter what she tries, nothing happens.
Frustrated and confused, Anna swipes to the left once, and finds a blank area. She decides to double tap it, thinking maybe that’s just how some things work on Android, like how sometimes on a website, a text label is right before a box where the information for that label goes. And she’s right. The updates screen opens, and she’s given a list of updatable apps.
She then decides to look at the Play Store. She’s heard of that before, in ads on the TV and radio, where people are talking about downloading an app. She finds this app a bit easier to use, but also a lot more cluttered. She finds the search item easily, but cannot find the updates section. She decides to just give that a break for now.
She decides to call her mom on her new phone, and see how that works. So, she double taps and holds on the home button to bring up Google Assistant. She knows how to do this because I don’t feel like making up some way she can figure that out. Oh and let’s say her contacts were magically backed up to Google and were restored to her new phone.
TalkBack says “Google Assistant. Tap to dismiss assistant.” Startled, the command Anna was about to give is scattered across her mind. She finds and double taps the back button. She tries again. The same thing happens. She’d never had that issue before. Trying to talk past TalkBack, she talks over the voice saying “tap to dismiss assistant,” and says “call”… TalkBack echoes “call,” then says “double tap to finish.” Assistant then says “Here’s what I found.”
Anna is done for the day. She puts the phone away, and from then on, just uses Bixby, Samsung’s own assistant. Maybe after a year or two of this, she goes back to an iPhone. Let’s hope she finds out about a training center and gets a job somewhere, and lives happily ever after. Or something.
Having a screen reader work with an operating system requires tons of collaboration. The screen reader team, and indeed every accessibility team that deals with an operating system, should be embedded within the operating system teams. From user interface elements to how a person is supposed to do things like update apps or talk to an assistant, these things should be as easy as possible for someone using a screen reader. You, as a developer far removed from the user, don’t know if the person interacting with your software has just lost their vision, or is having a particularly suicidal day, or anything else going on in their lives. It’s not that far of a stretch to say that your software may be the difference between a person giving up, or deciding that maybe being blind isn’t a life-ending event after all.
Let’s go through what happened here. TalkBack, in this case Samsung’s build of TalkBack that’s still on version 13 even though 14 has been out for months now, does have a built-in tutorial. It’s been there for as long as I’ve used it, and they keep updating it. That’s extremely commendable for the TalkBack team.
Now, let’s focus on the Galaxy Store for a moment. An ad wasn’t a big deal in this case. It didn’t jumble up and slow down the screen reader, like some other ads tend to do. It was just a simple link. But in a lot of cases, these ads aren’t readable as ads. They’re just odd things that show up, with buttons to not show the ad again, and to close the ad. It’d be kinda nice if the window read “ad” or something before the ad content.
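That part, at least, is fixable by the app itself. The Galaxy Store is an Android app, but the principle is the same on any platform; to keep all the examples in one language, here is a minimal UIKit sketch (view names hypothetical) of an ad container that announces itself as an ad first and exposes its “close” and “don’t show again” controls as accessibility actions instead of stray buttons:

```swift
import UIKit

// A sketch: present the whole ad as one element, announce that it is an ad
// before its content, and expose "close" and "don't show again" as custom
// actions. adContainer, headline, and the handlers are hypothetical.
func markAsAdvertisement(_ adContainer: UIView,
                         headline: String,
                         onClose: @escaping () -> Void,
                         onDontShowAgain: @escaping () -> Void) {
    // One element, read in one pass, instead of a scattering of unlabeled views.
    adContainer.isAccessibilityElement = true
    // Lead with "Advertisement" so the person knows what they've landed on.
    adContainer.accessibilityLabel = "Advertisement. \(headline)"
    // The screen reader announces that actions are available and lets the person pick.
    adContainer.accessibilityCustomActions = [
        UIAccessibilityCustomAction(name: "Close ad") { _ in
            onClose()
            return true
        },
        UIAccessibilityCustomAction(name: "Don't show this ad again") { _ in
            onDontShowAgain()
            return true
        }
    ]
}
```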
Now, for the worst thing about the store. The download and update buttons are broken, with no fix coming any time soon. Operating system developers, even if they’re building from a base like Android, need accessibility teams that include a blind person, a Deaf person, and so on. And those disabled people need to have enough power to stop an update that breaks a part of the experience, especially one as important as updating apps.
A screen reader can only read what an app tells it to read, unless there’s AI involved. And for me, there was. When TalkBack landed on that unlabeled item, it said something like “downloads,” so I knew to activate that instead of the text label that’s just sitting there outside the actual item. A screen reader is only part of a wider system. An app has to be labeled correctly. An operating system has to be able to tell the screen reader about the item’s label. And the person has to know how to use the screen reader, and operating system, and app.
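Sticking with a UIKit sketch, the fix for that “label sitting outside the item” pattern is to put the label on the element that actually takes the action. The names `updatesRow` and `updatesCountLabel` are hypothetical:

```swift
import UIKit

// A sketch of putting the label on the element that actually takes the action,
// instead of leaving the text sitting next to an unlabeled, tappable blank.
// updatesRow and updatesCountLabel are hypothetical names.
func labelUpdatesRow(_ updatesRow: UIView, updatesCountLabel: UILabel) {
    // The tappable row is the accessibility element the screen reader lands on.
    updatesRow.isAccessibilityElement = true
    updatesRow.accessibilityTraits = .button
    updatesRow.accessibilityLabel = "Updates, \(updatesCountLabel.text ?? "none") available"
    // Hide the decorative text label so it isn't announced as a second, dead stop.
    updatesCountLabel.isAccessibilityElement = false
}
```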
And finally, the Google Assistant. Context matters a lot here. What is the user trying to do? Talk to a digital assistant. Would you like it if the person you are talking to suddenly talked over you, and then began repeating the things you’re saying? No? I thought not. In these kinds of contexts, a screen reader should be silent. The digital assistant, if needed, should be reading any options that appear, like choices for a restaurant or phone number. There should be no doubt here. There are times when a screen reader should be silent, and this is one of the most important. If you are wondering about this, ask a blind person. We do exist, and we’ve had enough of these stupid issues that could be resolved if we were just asked.
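iOS actually has a hook for part of this already: the `.startsMediaSession` accessibility trait tells VoiceOver to stay quiet when an element is activated, which was designed for things like record buttons but fits a microphone button just as well. A sketch, with `micButton` and the announcement helper being hypothetical names:

```swift
import UIKit

// On iOS, the .startsMediaSession trait keeps VoiceOver quiet when the element
// is activated, so the screen reader doesn't talk over an open microphone.
// micButton is a hypothetical name for the assistant's listen button.
func configureAssistantMicButton(_ micButton: UIButton) {
    micButton.accessibilityLabel = "Ask the assistant"
    micButton.accessibilityTraits = [.button, .startsMediaSession]
}

// Once the assistant has something to say, announce it deliberately rather
// than letting focus changes talk over the person mid-sentence.
func announceAssistantResult(_ text: String) {
    if UIAccessibility.isVoiceOverRunning {
        UIAccessibility.post(notification: .announcement, argument: text)
    }
}
```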
Conclusion
Alt-text is just the tip of the iceberg of accessibility. After all, if a person isn’t told how to use the screen reader, what good is Alt-text if they can’t even get to the site to read it? A screen reader is not enough. An app has to be built to be accessible and easy to use, or a person won’t know where to go to do the thing they want. An operating system has to have a good enough accessibility framework to allow a screen reader to, say, turn off its own touch interaction methods to allow a person to play a video game, or play a piano. All of these pieces create the accessibility experience that a person has. And this isn’t even getting into Braille and tactile interfaces.
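On iOS, that piano case even has a name: the `.allowsDirectInteraction` accessibility trait lets a view receive raw touches while VoiceOver is running, instead of the usual focus-then-double-tap handling. A minimal sketch, with `PianoKeyboardView` being a hypothetical custom view:

```swift
import UIKit

// A sketch of the piano case: the .allowsDirectInteraction trait lets VoiceOver
// pass raw touches straight to this view instead of requiring the usual
// focus-then-double-tap interaction. PianoKeyboardView is a hypothetical view.
final class PianoKeyboardView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        configureAccessibility()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        configureAccessibility()
    }

    private func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = "Piano keyboard"
        // Touches go directly to the keys while VoiceOver is running.
        accessibilityTraits = .allowsDirectInteraction
    }
}
```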
I hope this post has shown some of what we as blind people experience. After a decade of development, you’d think some of these things, issues, and capacity for issues, would have been resolved a long time ago. But developers still don’t understand that they’re not developing alone, and that it takes constant cooperation to make things not just accessible, as in every button is labeled, but great, as in the user is confident in how to use their screen reader, the app they’re in, and the operating system as a whole. Until then, these problems will not be resolved. A company-wide, cultural mindset does not fix itself.

Blind people are so used to this that very few even know that they can contact Apple or Google developers, and fewer still think it’ll do any good. They see developers as ultra-smart people in some castle somewhere, way out of mind of the people they develop for, and there’s a reason for that. Developers are going to have to speak with the people who rely on what they build. And not just the people who will tell the developers what they want to hear, either. Developers need to be humble, and never, ever think that what they’re building is anywhere close to good enough, because once they do, they start to dismiss accounts of things not working right, or a need for something better, or different, or more choices. Until these things happen, until people can trust developers to actually be on their side, these kinds of stories will continue, unnoticed and undiscussed.