Mobile Video Gaming

I’ve played video games since I was around 5. I started with Mortal Kombat on the Sega Genesis, or the Mega Drive outside the US. I learned to play more by accident than anything else. It was one of the poorer-quality ports of that game, using low-quality audio samples, not speaking chosen character names, and not announcing who won a round. The music was great though. Anyway, throughout my life, I’ve always played video games. Of course, during my teenage years I got into audio games and such, but I still played video games because they usually had more content, or simply because there weren’t many fighting games in audio form. Besides, by that time, I was comfortable with the video games I could play. I knew the menus I needed, where my favorite characters were, and stuff like that.

Now, I’ve achieved something I never thought I’d be able to do on mobile devices: playing nearly all my favorite games, from the PlayStation Portable, to GameCube and Wii ports of PS2 games I had when I was younger, to the actual PS2 games on Android. In this blog post, I’ll talk a bit about how I did it, but mostly about my experiences with playing these games on Android, and even iOS!

Emulation everywhere!

So, a few years ago, I wrote a post about emulation or something or other. I even made a video of me playing Dissidia Final Fantasy on an iPhone XR. At the time, you couldn’t do that without using workarounds like AltStore. Then, around a year ago, Apple began allowing emulators onto the App Store, as long as they didn’t come with any games or BIOS files. So, we got RetroArch, PPSSPP, Delta, and the like. In my opinion, RetroArch is the best: it’s got an accessibility mode that turns on when VoiceOver is on, it supports the PSP, PS1, the Sega consoles, and more that Delta doesn’t have, and blind people who game on PC are already used to it.

Android, meanwhile, has had emulators for over a decade. And, with TalkBack image descriptions, you can even have parts of games described, if the emulator is made right. I’ll get into that later though. Now, you can play PS2, GameCube and Wii, 3DS, and even Windows games, through Wine, on Android. However, I’ve not tried Winlator or the 3DS emulators, so the accessibility and security of those are unknown to me.

Emulating on iOS

So, I’m starting out with this section because iOS is the platform I use most, it’s probably going to get the most attention from gamers, and it’s going to take the longest to write, maybe. We have RetroArch in the App Store. It works a lot like the PC version, but doesn’t have systems like the PS2, GameCube, Wii, PS3, PS4, PS5, and PS6. It doesn’t even have the Nintendo Switch 2 XL XR XS Max Pro XDR! Boo! But what it does have is a ton of older systems, from the NES and Sega Genesis, to the N64 and PS1. Oh, and the PSP of course, where all the cool games of that era are. Well, besides Dragon Ball Z: Budokai Tenkaichi 3.

So, it’s pretty simple to use. Drop your games into the Downloads folder, or make a games folder within the RetroArch app folder on your phone, and you’re good. Well, besides having to deal with the iOS Files app and its broken copy and paste commands. To get around that:

1. Find the file, and swipe down to Copy.
2. Find where you want to paste it.
3. Find an empty part of the screen.
4. Turn VoiceOver off.
5. Tap and hold, then release when you feel a vibration.
6. Turn VoiceOver back on.
7. Find the Paste button, then double tap.

Yes, it’s annoying, and a lot of steps. But we iOS users must cope any way we can, right?

A few days ago though, I found something amazing. You see, emulators for newer consoles, like the PS8 and XBox 100X, must use a technique called JIT (just-in-time compilation) to boost performance to acceptable levels. Well, the PS2 and GameCube emulators and virtual machines too. However, big mean red or green Apple blocks all apps except its own from using it, in the name of sEcUrItY. Thankfully, we have alternatives, and they don’t involve switching to Android. Well, I mean, if you can handle it, please do; you can emulate far more easily on it. But if you’re like me and Android just isn’t ready for you yet, you can still play!

So, you’ll use two things to deal with all this: AltStore, linked above, and JitStreamer. Basically, install AltStore, grab Dolphin for iOS for best performance, or Play! for PS2 if you really need it, and set up JitStreamer. That means installing the WireGuard VPN app, running that other package on your computer with your phone connected, uploading the file it produces to the JitStreamer webpage, getting your WireGuard configuration file and opening it in WireGuard from the Files app, and grabbing the Shortcut from the website. It’s a few steps, but it’s worth it!

Then, open your Shortcuts app, double tap on JitStreamer, and choose your app. It may have to download a disk image, and ask for notification permission, but all of that is accessible to you, at the top of the screen. Just press the OK button on each message. Once you choose your app, the shortcut will open it for you.

Now, about the PS2. I tried several games, from Soul Calibur 2 to Mortal Kombat Armageddon, and none of them ran at a speed where the sound wasn’t choppy. So I’ll keep waiting on that one. Dolphin, however, can play almost every game I’ve tried perfectly, besides Soul Calibur 2. So, put your ISO or RVZ files in the Software folder in the Dolphin app folder on your phone. Honestly, I use cloud services, like Nextcloud, to transfer things across from computer to phone. If you have a Mac, perhaps AirDrop will work for you, if it’s feeling amicable that day.

Now, all you need is a controller or a keyboard. In Dolphin, you’ll need to map the controls, and that’s perfectly accessible. Then, just run your game. There will be a single button on the game screen that VoiceOver sees, which will open a menu for pausing the game, stopping the game, hiding those controls, or opening settings.

VoiceOver Recognition and Describe Screenshot

Emulation is great for blind people. It allows the blind to choose their game, use comfortable controls, and, for clearer audio, use headphones where the original console may not have had a headphone jack. But it also allows for text recognition and AI descriptions.

So, let’s use that for all it’s worth! When you’re in a game, like Mortal Kombat Armageddon for the Nintendo Wii, you can enable VoiceOver’s screen recognition in order to get a sense of menus, and to read the character selection screen, sometimes. There are times when it doesn’t read a lot, but it usually works well. In games like Dissidia Final Fantasy, you can read character dialog, or what interactable game board piece you’ve found, or story mode titles. And, if you leave it on an item, it updates automatically. So, you can have it read character selections as you move through the selectable characters.

The Describe Screenshots shortcut is amazing. It works in any app, on any screen, and uses one of the best AI models available for describing stuff. So, if you want to know what menu item is selected, or what a character or stage in a game looks like, you can just use that. It’s been very helpful when text recognition isn’t understandable, or doesn’t see everything. Maybe one day VoiceOver will be able to do this on its own, in a good decade or so after everyone else “does it first”… Oh wait, they already have. Well, besides Linux. So maybe in 5 years we’ll have VoiceOver using large language models.

So, gaming on iOS, playing real video games that is, has gone from very… scenic, to only a little scenic, or a tad bit more. Not many people know about emulators, but I definitely want more gamers to know that they can indeed play nearly any older game they want, all on their iPhone!

Emulating on Android

Android has it easy. You just download, or sideload, an app, plop games from File Explorer onto your phone, open the app, and play. But how accessible is that? Find out in the next paragraph, because every paragraph should have from three to five sentences. Or six, or eight, or over nine thousand.

So, RetroArch works well. Its accessibility mode is rather slow because it sends announcements through TalkBack, but Android is slow already, so what’s a bit more lag, right? I mean, it’s jUsT aS gOod, right? Anyway, I don’t blame RetroArch at all, since on Windows, it speaks through NVDA and that works perfectly fine.
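
For the curious developers out there, here’s a minimal Kotlin sketch of the general technique: sending speech through whatever screen reader is running, instead of a bundled TTS engine. I don’t know RetroArch’s actual plumbing, so the function here is my own invention, but the `announceForAccessibility` call is a real Android API.

```kotlin
import android.view.View

// A minimal sketch, not RetroArch's real code: route speech through the
// active screen reader (TalkBack) instead of a bundled TTS engine.
fun speakThroughScreenReader(anyVisibleView: View, text: String) {
    // Fire-and-forget: this raises an accessibility announcement event,
    // which TalkBack queues behind whatever it's already saying. That
    // queueing is where the extra lag comes from.
    anyVisibleView.announceForAccessibility(text)
}
```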

One issue is that it doesn’t expose any onscreen elements, even just a fullscreen game area, to TalkBack. That’s important because with TalkBack 15, we can now have images described through Gemini AI. So that’s nice. Except, the item you want described has to be reachable by TalkBack. So, if TalkBack can’t “see” anything on the screen, it can’t describe what’s on the screen.
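
If I were an emulator developer, the fix might look something like this hedged Kotlin sketch: give the fullscreen game surface an accessibility node, so TalkBack has something to land on and describe. The function and label names are my own inventions; the properties themselves are real Android APIs.

```kotlin
import android.view.SurfaceView
import android.view.View

// A sketch under assumptions, not any emulator's actual code: expose the
// fullscreen game surface to TalkBack so it has something to "see".
fun exposeGameSurface(surface: SurfaceView) {
    // Make sure the accessibility tree includes this view at all.
    surface.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_YES
    surface.isFocusable = true
    // The name TalkBack speaks when you touch the surface; once it's
    // focused, the person can ask for an image description of it.
    surface.contentDescription = "Game screen"
}
```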

There are, however, many powerful warriors throughout the realms. Like AetherSX2! That’s a PS2 emulator, basically a kind of port of PCSX2, or something like that. It was forked because the original developer was used all up by users and abusers. So there’s a fork, and it has a fullscreen element that TalkBack can see, so you get image descriptions.

Dolphin works well too, along with Duckstation for PS1 games. So, plenty to play on Android too.

Conclusion

So, in conclusion: Apple should give us JIT, because we already have it anyway. Google should hire tons more blind people onto its accessibility teams, and open up its trusted tester program, so we don’t have Google fanboys telling Google what it wants to hear while everyone else asks why Android isn’t as fast as iOS on a Samsung Galaxy S25 Ultra. And emulator developers should be commended for putting so much effort into platforms like iOS, which must be an epic pain to develop for and get past app review. Also, more developers should use AltStore. Epic could have for Fortnite, but they just want their own bullcrap instead of actually standing for what they sued Apple over.

Categories: accessibility Android apple blindness Google iPhone


Contextual Accessibility

I’m sure all this has been said many times before, but all I’ve been able to find so far is this article on Alt-Text, which isn’t what I’m going for here. As a side note, I get that most members of society can only improve accessibility by adding alt text to the visual media they create, but isn’t it time we move a bit past just image descriptions? Guides for alt text are everywhere. I am not about to make yet another. This will, however, be a longer article than usual, and I wouldn’t blame you for coming up for air during this one.

In this article, I’ll talk about accessibility for blind people: not just the screen reader, which gets talked about plenty, but also the operating system, and the activity the blind person is trying to do. In writing this article, I hope to zoom out a bit and show developers, and others interested, that accessibility encompasses everything about the device: the operating system, accessibility stack, screen reader, app, and the person using the device. There is no solitary part of that stack.

How to introduce a screen reader

I know this is going to be an edge case. But honestly, people in the accessibility space should be used to edge cases by now, and this happens much more than you’d think. We have our blind person here: a person who, up to now, has been fully sighted, but lost her sight instantly due to an accident. No, I’m not going into details, because there are tons of ways a person can lose their vision. No, this isn’t a real person, and I’m not, in any way, saying that this is how instant sight loss occurs or affects people, but I know that if I lost my sight this way, this is probably what would happen. And I know I’m not the only one that struggles with depression and anxiety.

So this person, let’s name her Anna, is trying to get used to having an entire sense just gone. Maybe she’s in pain, or maybe her mind is numb in trying to process everything. Or maybe she’s about to give up and find a sharp knife with which to end it all, searching and searching in the now permanent dark. But she remembers learning about Helen Keller in school. Thinking back though, she doesn’t remember covering any more than Keller’s childhood. Anna wants to find a book on the rest of Keller’s life, so she stores that in the back of her mind, with a note to figure out how to get an audiobook.

Now, you may be considering this amount of personal story to be a bit much, or over the top, or inappropriate for an informational post. But keeping things clinical honestly makes blind people into nothing more than a test subject to be observed, and understood on a sort of acquaintance basis. I, for one, am tired of that. I, and every other blind person, am a living, thinking, feeling person. Notice that I didn’t say “blind user” above? No. A user is some case number or hypothetical list of assumptions to be grabbed, accounted for, and discarded at the end. But a person, well, they’re alive. They deal with their everyday trials and triumphs. They breathe and speak and change and grow and learn. A person is a bit less easy to discard, or to mock as “just another ‘luser’”. And yes, I’ve had to learn this too, over my years of teaching blind people.

Now, Anna got some help with her mental health, and she’s trying to learn to deal with being blind. She’s learned to feel for the light switch, to tell if a light is on or not, since she can’t see it. She’s learned that she needs to keep everything in order, so that she can find things much more easily. She’s even started making her own sandwiches, ham and cheese and some mayo, and pouring some cereal, even though she makes a bit of a mess with the milk. She now keeps a towel on the counter, for that reason.

But today, she’s gonna do something that she thinks would be impossible. She feels around on her nightstand for the slab of glass. Her iPhone. She hasn’t touched it since she went blind a week ago. Or has it been a few weeks? A month? She takes a deep breath, and lets it out. She breathes for a moment, then starts to examine the device. A flat screen with two horizontal lines for the earpiece. Buttons on both sides. A flat piece of glass with the camera on the back. You get the idea.

She then decides to try using Siri. Siri wasn’t especially useful to her before, but it’s all she has now. She tries some simple requests, like asking for the time, or the weather. She then decides to be a little more brave. Taking a deep breath, she asks “How can a blind person use an iPhone?” Siri responds, “Here’s what I found on the web. Check it out!”

Anna waits a moment, remembering what this screen looks like. She waits, thinking that maybe Siri would be so kind as to read it out to her. Siri, it turns out, is not so kind. Her room is silent. Tears do not make a sound as they fall.

She sits there for a moment, trying to pull herself together. Trying to gather up the pieces that had just gone flying everywhere. She breathes in, and breathes out. She grips the phone in her hands, and tries the last thing she can think of. Because if this doesn’t work, then this expensive phone is nothing but expense to her. She tells Siri to call her mother. “Okay, calling Mom.” Her heart stops to listen. Her mind strains to hear. And the phone finally rings. And, after a moment, she hears her mother’s voice.


Now, this is without any kind of outside agency helping her, and I shudder to think of how many blind people go through this. They use Siri to text and call, read their texts and notifications, create a note and read it out, and set alarms. But apps, well, YouTube and books and anything besides texting, those are lost to them. Yes, having the ability to make a call or send a text is really important. And in this case, it’s probably what made the difference between barely surviving and possibly thriving. Because I’ve known people who come in with their iPhone and only know Siri. And maybe they’ve tried to use VoiceOver. But here’s where that falls apart.


Anna waited for the doctor to hang up the phone. She hated this part. The doctor hung up, but the automated system on which the call happened, well, it takes a minute. Finally, she heard the static and click. She’d been noticing more of that kind of thing. How the tone of the milk changed as she filled a cup, or the way her footfalls echoed when she was near a door. She’d kind of started making sounds, clapping her hands or stepping a bit louder, just to experiment with the sounds. Not too much though. She never knew if someone was looking in through the window at that weird blind girl clapping at nothing or stomping her feet.

Anna smiles at the memories, but reminds herself that she’s got a thing to try. Through some miracle, her mom had found an app on the iPhone, called VoiceOver. It was supposed to read out what is on the screen to you, and guide you through tasks on it. So, she told Siri to “Turn on VoiceOver.”

Immediately, VoiceOver turned on. She heard a voice say “VoiceOver on.” And then it read the first app on the Home Screen. She was impressed, but wasn’t sure what to do next. The voice then said “double tap to open.” She tapped the screen twice. It read another app on the screen… Twice.

She continued playing with it, but couldn’t make much sense of it. She could feel around the screen, but then what? When she tapped on an item to open it, it just read that item. She tapped twice like it said, but it just read the item again. In frustration, after hearing “double tap to open” for the hundredth time, she quickly jabbed at the phone. Miraculously, the app opened. She had been touching the weather app. She felt around the screen, hearing about, well, the weather. Exasperated, she put the phone down and took a nap. She deserved it.


And here we have the meat of the issue. Yes, I tried that question with Siri, about how a blind person would use an iPhone, and she gave that “here’s what I found on the web” answer. This is even on the iOS 17 beta, where Siri is supposed to be able to read webpages. But Siri can’t even ask the person whether it should read the page or not. Because that’s just too hard.

Siri could have easily been programmed to tell the person about VoiceOver, and at least give tips on how to use it. But no. If there’s a screen, people inherently can view the screen, and a digital talking assistant shouldn’t get in the sighted person’s way. Right?

So, let’s look at accessibility here, in the context of a voice assistant. The voice assistant should never, ever assume that a person can see the screen. Like, ever. If it’s pulled up a long article, the assistant should offer to read the article. It doesn’t matter if a screen reader is on or not. As you read above, Anna didn’t even know the concept of a screen reader, let alone that one is on the iPhone.

Mobile phones have amazing accessibility potential specifically because of the voice assistant. The voice assistant is in popular culture already, so a newly blind person can at least use Siri, or Google Assistant, or even Bixby.

So, an assistant should be able to help a blind person get started with their phone. The digital assistant can lead a blind person to a screen reader, or even read its documentation like a book. If Amazon’s Alexa can read books, then Apple’s Siri can handle reading some VoiceOver documentation.

Moving forward, Anna had no idea how to use VoiceOver. Performing a double tap is incredibly difficult to learn if you don’t have another person there who already knows, and can show you. You may think that it’s just tapping twice on the screen, but you have to do it very quickly compared to a normal tap. A little too slow, and VoiceOver just thinks you’re tapping on an item twice. And even when that’s learned, the person has to learn how to swipe around apps, type on the keyboard, answer and end calls, and find the VoiceOver settings, to turn off the hints that are now getting in her way and making her lose focus on what she’s trying to do.

It’s been more than a decade since VoiceOver was shipped. It’s seriously time for a tutorial. And yes, while I know that there are training centers for the blind across the US, I’m pretty sure there are plenty of blind people that never even hear about these centers, or are too embarrassed to go there, or can’t go due to health reasons.

The screen reader, VoiceOver, in this case, was even less helpful than the voice assistant. Imagine that. A voice assistant helped Anna make a phone call. VoiceOver, due to the lack of any kind of usage information, just got in her way. That’s really sad, and it’s something I see a lot. People just turn back to Siri because it’s easier, and it works.

The idea that blind people just “figure it out” is nothing but the shirking of responsibility. Training centers should not have to cover the basics of a screen reader. But we do, and it takes anywhere from two weeks to a month or more of practice, keeping the student from slipping back into the sweet tones of Siri that can barely help them, and showing them all of what VoiceOver can do, which Apple just can’t manage to tell the person. Because it’s so much easier to show off a new feature than to build up the foundations just a tad bit more.

Context matters. Yes, an advanced user won’t need to be taught all of VoiceOver’s features and how to do things on the phone. But someone who has just gone blind, or someone on their first iPhone, they need this. I don’t care if it’s Siri introducing the screen reader after the person asks how a blind person can use a phone, or VoiceOver asking the person if they’d like to use it after a good minute or two of being on the Home Screen or Lock Screen with no activity, or a tutorial popping up the first time a blind person uses VoiceOver. But something has got to change, because there are newly blind people every year. It isn’t like we people who were born blind are dying off and no blind people are replacing us, so that VoiceOver looks like this temporary measure until 50 years from now, when there are no more blind people left, and VoiceOver can be left to rot and die.

Disabilities, whether we like it or not as a society, or even as a disability community, are going to be with us for a long time to come. You see how much money is poured into a big company’s accessibility team? Very little. That’s about how much money is also poured into medical fixes for disability. If it weren’t so, we’d all be seeing, and hearing, and walking, and being neurotypical, and so on, right now. They’ve had a good 40 years of modern medicine. They are not going to fix this in the next 40 years, either. Even if they do fix it for every disability, some people don’t want to be typical. They find pride in their disability. That’s who they are.

How a screen reader and operating system work together

For the sake of my readers, the ones left at this point, and myself, I’m going to continue to use Anna for this section as well. Let’s say she has figured out the iPhone thanks to her mom reading articles to her, and now her time to upgrade her phone has arrived. But she’s not found a job yet, and an iPhone is way out of her current budget. Oh, did I tell you she was “gracefully let go” from her old job at Walmart? Yeah, there’s that too. Her bosses just knew that she couldn’t possibly read the labels on the goods she normally organized and put on the shelves, because they don’t have Braille on them.

Now, Anna is looking for a new phone. Her iPhone X is getting slow, and she’d finally paid off the phone with the help of her new SSDI checks, and her mom where needed. She doesn’t like to ask people to do things for her unless absolutely necessary.

So, she goes to a phone store and picks out the best Android phone she can find. It’s not a flagship phone by any means. It’s not even a flagship killer. It’s just a Samsung phone. She pays for it, down a good $250, and walks out with the phone.

Knowing how things generally are now, she holds down the power button, feels a nice little buzz (but not as nice as her iPhone could make), and asks her mom to read her some articles on how to use a Samsung with… whatever VoiceOver it has.

So to make a long story short, she turns on TalkBack, goes through the tutorial built right into the screen reader, and is instantly in love. She can operate the screen reader, without her mom even needing to read her the commands and how to do them. She hits the finish button on the tutorial, and is ready to go.

She first wants to make sure she’s running the latest version of everything. So, she opens the Galaxy Store, and feels around the screen. She finds a link of some kind, then a “don’t show again,” and a “close” button. Puzzled, she activates the close button. Now, she’s put into the actual store. She feels around the screen, finding tabs at the bottom. She’s not really sure what tabs are, but she’s seen them on some iPhone apps. She finds a “menu” tab, and activates that. Feeling around this new screen, she finds the updates item, and double taps on it. Nothing happens. She tries again, thinking she didn’t double tap hard enough, or fast enough, or slow enough. No matter what she tries, nothing happens.

Frustrated and confused, Anna swipes to the left once, and finds a blank area. She decides to double tap it, thinking maybe that’s just how some things work on Android, like how sometimes on a website, a text label is right before a box where the information for that label goes. And she’s right. The updates screen opens, and she’s given a list of updatable apps.

She then decides to look at the Play Store. She’s heard of that before, in ads on the TV and radio, where people are talking about downloading an app. She finds this app a bit easier to use, but also a lot more cluttered. She finds the search item easily, but cannot find the updates section. She decides to just give that a break for now.

She decides to call her mom on her new phone, and see how that works. So, she double taps and holds on the home button to bring up Google Assistant. She knows how to do this because I don’t feel like making up some way she can figure that out. Oh and let’s say her contacts were magically backed up to Google and were restored to her new phone.

TalkBack says “Google Assistant. Tap to dismiss assistant.” Startled, the command Anna was about to give is scattered across her mind. She finds and double taps on the back button. She tries again. The same thing happens. She’d never had that issue before. Trying to talk past TalkBack, she talks over the voice saying “tap to dismiss assistant,” and says “call”… TalkBack then says “call,” and then “double tap to finish.” Assistant then says “Here’s what I found.”

Anna is done for the day. She puts the phone away, and from then on, just uses Samsung’s assistant that I don’t remember how to spell right now. Maybe after a year or two of this, she goes back to an iPhone. Let’s hope she finds out about a training center and gets a job somewhere, and lives happily ever after. Or something.


Having a screen reader work with an operating system requires tons of collaboration. The screen reader team, and indeed, every accessibility team that deals with an operating system, should be in the midst of operating system teams. From user interface elements to how a user is supposed to do things like update apps, or talk to an assistant, these things should be as easy as possible for a person using a screen reader. You, as a developer far removed from the user, don’t know if the person interacting with your software has just lost their vision, or is having a particularly suicidal day, or anything else going on in their lives. It’s not that far of a stretch to say that your software may be the difference between a person giving up, or deciding that maybe being blind isn’t a life-ending event after all.

Let’s go through what happened here. TalkBack, in this case Samsung’s TalkBack that’s still running 13 even though 14 has been out for months now, does have a built-in tutorial. It’s been there for as long as I’ve used it. And that’s extremely commendable for the TalkBack team. Also, they keep updating it.

Now, let’s focus on the Galaxy Store for a moment. An ad wasn’t a big deal in this case. It didn’t jumble up and slow down the screen reader, like some other ads tend to do. It was just a simple link. But in a lot of cases, these ads aren’t readable as ads. They’re just odd things that show up, with buttons to not show the ad again, and to close the ad. It’d be kinda nice if the window read “ad” or something before the ad content.
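
For what it’s worth, labeling an ad like that can be a one-liner on Android. Here’s a hedged Kotlin sketch; `adContainer` is a hypothetical view of mine, but the pane title API is real androidx.

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// A sketch with a hypothetical adContainer view: make the ad region
// announce itself as an ad before its content.
fun labelAd(adContainer: View) {
    // TalkBack speaks the pane title when focus moves into this region,
    // so the person hears "Ad" before the "close" and "don't show again"
    // buttons.
    ViewCompat.setAccessibilityPaneTitle(adContainer, "Ad")
}
```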

Now, for the worst thing about the store. The download and update buttons are broken now, with no fix coming any time soon. Operating system developers, even if they’re developing from a base like Android, need accessibility teams including a blind person, Deaf person, and so on. And those disabled people need to have enough power to stop an update that breaks a part of the experience, especially one as important as updating apps.

A screen reader can only read what an app tells it to read, unless there’s AI involved. And for me, there was. When TalkBack landed on that unlabeled item, it said something like “downloads,” so I knew to click on that instead of the item label that’s just sitting there outside the actual item. A screen reader is only part of a wider system. An app has to be labeled correctly. An operating system has to be able to tell the screen reader about the item label. And the person has to know how to use the screen reader, and operating system, and app.
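
As I understand that Galaxy Store bug, the visible text and the clickable item are two separate views, and only the useless one has a name. Here’s a hedged Kotlin sketch of the fix; both views are hypothetical stand-ins, not Samsung’s actual code.

```kotlin
import android.view.View
import android.widget.TextView

// A sketch of the fix, with hypothetical views: put the name on the
// element that actually does something, and hide the decorative label.
fun fixUpdatesRow(decorativeLabel: TextView, clickableItem: View) {
    // The clickable view carries the name, so a screen reader reads and
    // activates the same node.
    clickableItem.contentDescription = decorativeLabel.text
    // Keep the bare label from grabbing focus as a separate, dead item.
    decorativeLabel.importantForAccessibility =
        View.IMPORTANT_FOR_ACCESSIBILITY_NO
}
```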

And finally, the Google Assistant. Context matters a lot here. What is the person trying to do? Talk to a digital assistant. Would you like it if the person you are talking to suddenly talked over you, and then began repeating the things you’re saying? No? I thought not. In these kinds of contexts, a screen reader should be silent. The digital assistant, if needed, should be reading any options that appear, like choices for a restaurant or phone number. There should be no doubt here. There are times when a screen reader should be silent, and this is one of the most important. If you are wondering about this, ask a blind person. We do exist, and we’ve had enough of these stupid issues that could be resolved if we were just asked.

Conclusion

Alt-text is just the tip of the iceberg of accessibility. After all, if a person isn’t told how to use the screen reader, what good is Alt-text if they can’t even get to the site to read it? A screen reader is not enough. An app has to be built to be accessible and easy to use, or a person won’t know where to go to do the thing they want. An operating system has to have a good enough accessibility framework to allow a screen reader to, say, turn off its own touch interaction methods to allow a person to play a video game, or play a piano. All of these pieces create the accessibility experience that a person has. And this isn’t even getting into Braille and tactile interfaces.

I hope this post has shown some of what we as blind people experience. After a decade of development, you’d think some of these things, issues, and capacity for issues, would have been resolved a long time ago. But developers still don’t understand that they’re not developing alone, and that it takes constant cooperation to make things not just accessible, as in every button is labeled, but great, as in the person is confident in how to use their screen reader, the app they’re in, and the operating system as a whole. Until then, these problems will not be resolved. A company-wide, cultural mindset does not fix itself.

Blind people are so used to this that very few even know that they can contact Apple or Google developers, and fewer still think it’ll do any good. They see developers as ultra-smart people in some castle somewhere, way out of mind of the people they develop for, and there’s a reason for that.

Developers are going to have to speak with the people that rely on what they build. And not just the people that will tell the developers what they want to hear, either. Developers need to be humble, and never, ever think that what they’re building is anywhere close to good enough, because once they do, they start to dismiss accounts of things not working right, or a need for something better, or different, or more choices. Until these things happen, until people can trust developers to actually be on their side, these kinds of stories will continue, unnoticed and undiscussed.

Categories: accessibility Android apple blindness
