
Beyond Parity: The Case for True Accessibility Affordances

There’s a debate in the world of digital accessibility. Should the experience for a screen reader user simply mirror that of a sighted person, achieving functional parity? Or should it strive to be as helpful and efficient as possible, even if it means creating unique capabilities for non-sighted users? I argue for the latter. The goal shouldn’t be mere parity, but the creation of powerful affordances that make digital interfaces genuinely more accessible.

Defining Affordances for Accessibility

In design theory, an “affordance” is a quality of an object or environment that allows an individual to perform an action. A handle affords pulling; a button affords pushing. For this discussion, I want to expand that definition: an accessibility affordance is a feature that provides a non-sighted user with a capability that a sighted user doesn’t typically have, creating a more efficient and powerful user experience.

A good example is the virtual buffer used by screen readers on Windows. It allows a user to select and copy non-editable text from a webpage, something a sighted user can’t do without enabling a special mode like Caret Browsing.

This difference is very pronounced on mobile. On an iPhone, VoiceOver’s Text Selection rotor lets me select any text on a webpage. This is really useful for sharing quotes or saving information. On Android, TalkBack lacks this fundamental affordance, forcing users to rely on third-party apps like Universal Copy to achieve the same result.

Affordances Everywhere

Affordances are used on every operating system. On Android, for example, the Google Messages app sends accessibility notifications when a new message is received. This allows a blind person to hear a new message in a conversation without having to navigate to it. If the philosophy of no affordances were followed, the blind person would only hear the “incoming message” sound, and would then have to feel around the screen manually to read the new message.

macOS and iOS are full of affordances. VoiceOver on macOS, even though it’s an abomination of hacks sitting on top of another mess, has things like the Application and Window choosers. Even though macOS lets you press Command + Tab to switch between applications and Command + Grave Accent to switch between open windows, you can also use VoiceOver to open a list of running applications, or a list of open windows in the current application. This lets you set focus to system windows, like a problem report.

On iOS, in the Books app, a blind person can swipe down with two fingers when focused on the page of a book, and VoiceOver will begin reading the book, automatically turning pages and continuing to read. It continues only until the end of the chapter, which I think is a bug. In the Mail app, when you open a conversation with many messages, you can use the Messages rotor to swipe between them, which skips all the headers and action buttons for each and every message. None of this is possible in Google Play Books or Gmail on Android.

Now, Android has a few stand-out affordances. Face in View, only available on Pixel phones, helps a blind person take a selfie by giving hints on where to move the camera, and when the face is in view, it even takes the picture for them. Not even iOS takes the picture automatically. When using your fingerprint to unlock your device, Android gives you instructions on where to move your finger to find the fingerprint sensor. This works on Pixels, and will work on One UI 8 when it’s released, but doesn’t work on the OnePlus 13 because they broke it and fixing it isn’t a priority for them. TalkBack also comes with the ability to have images described using Gemini.

So, we’ve established that every operating system comes with accessibility affordances built in. But what about the places where they’re missing? I’ve hinted at a few already, and the next section makes it clear.

When Affordances Aren’t Used

Google and OnePlus make it all too easy to show how the philosophies differ: they seem to let the OS, the accessibility frameworks, and apps do all the work, while Apple seemingly tries to “script” VoiceOver into working with as much of the OS and its first-party apps as possible.

Let’s take a look at AI. It’s the big thing everyone is focused on, especially Google. Gemini is their star AI product. So, how does it do with accessibility? Well, it ranges from meh to awful.

On Android, Gemini with TalkBack works like this: if you speak to Gemini by holding down the side button, things generally work well. That is, unless you touch the screen. Then TalkBack speaks, and Gemini stops speaking. So if you accidentally touch an item, you’ll need to have TalkBack read the response, which means more feeling around the screen.

If you type to Gemini instead, Gemini will not send the response to TalkBack to be spoken, and will not speak it itself. Instead, you have to figure out for yourself when a response is ready to be read. One blind person has pointed out that sighted people don’t know when Gemini is done generating a response either, but I’d say the response simply being there is as good a signal as any. An affordance here would be for Gemini to send the response to TalkBack as an accessibility announcement to be read aloud. The Copilot app does this, and that’s the reason I use it at all.
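On Android, this kind of announcement takes only a couple of lines of framework code. Here’s a minimal sketch of the idea, not Gemini’s or Copilot’s actual implementation; the function and view names are mine. Marking the response view as a polite live region makes TalkBack speak its text whenever it changes, without stealing focus:

```kotlin
import android.view.View
import android.widget.TextView

// Hypothetical sketch: showResponse is an assumed name, not real app code.
fun showResponse(responseView: TextView, responseText: String) {
    // A "polite" live region asks TalkBack to announce text changes in this
    // view after it finishes whatever it is currently speaking.
    responseView.accessibilityLiveRegion = View.ACCESSIBILITY_LIVE_REGION_POLITE
    // Setting the text now triggers the announcement automatically.
    responseView.text = responseText
}
```

The same effect can be had with a one-off `announceForAccessibility()` call, but a live region ties the speech to the content itself, which tends to be more robust for streaming responses.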

On the desktop and iOS, Gemini says that it has replied, even though it is still generating a response. It has been this way for at least a year, and shows no sign of improving.

Why Develop Affordances?

Developing these features is not about adding “bells and whistles.” It’s about efficiency, respect for the user’s time, and a deeper understanding of the non-visual user experience.

Let’s take the Notification Shade on Android, for example. After each notification is an expand button. This could instead be an Accessibility Action, which would first let a blind person swipe through their notifications faster, since each notification would take up one element instead of two. Second, they could expand the notification with an action, or, if it’s a grouped notification, by simply double tapping. Let’s take Gmail next. If I open a conversation of about 10 messages, the only way to read through them all is to swipe through every single message: the header text, then the body, then the action buttons, for each and every message. I could scroll, but then I might miss one. So, having Accessibility Actions to move to the previous or next message in the conversation would make things much, much faster. iOS already has this in its built-in Mail app.
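For app developers wondering what this looks like in practice, a custom Accessibility Action is a small amount of code. This is a hedged sketch under my own names (`attachExpandAction`, `onExpand`), not anything from the actual Notification Shade source:

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Hypothetical sketch: add an "Expand" action to a notification-style row.
// TalkBack surfaces it in the row's Actions menu, so the row stays a single
// swipe stop instead of a row plus a separate expand button.
fun attachExpandAction(row: View, onExpand: () -> Unit) {
    ViewCompat.addAccessibilityAction(row, "Expand") { _, _ ->
        onExpand()
        true // report the action as handled
    }
}
```

`ViewCompat.addAccessibilityAction` handles the action ID bookkeeping for you; the label is what TalkBack reads out in the actions list.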

The TalkBack screen reader itself could be so much more powerful. Why can’t it select text in a web view? Why isn’t there a built-in OCR feature that can scan the screen for inaccessible elements and make them navigable, as screen readers on Windows and iOS have done for years? These are not edge cases; they are fundamental tools for independence and efficiency.

Conclusion

Given these criticisms, one might ask why I continue to use Android. The answer is that no platform is perfect, and my choice is a pragmatic one based on a series of trade-offs. The responsiveness of TalkBack on my OnePlus device, superior video game emulation, longer battery life on my peripherals, and the convenience of Google Messages for Web are features that I value highly.

This personal calculus, however, only reinforces the central point: we must advocate for better accessibility on all platforms. By embracing a philosophy of creating powerful, intelligent affordances, developers can move beyond the baseline of “making it work” and start building experiences that are truly empowering.

Categories: accessibility Android apple blindness Google iPhone productivity


Mobile Video Gaming

I’ve played video games since I was around 5. I started with Mortal Kombat on the Sega Genesis, or Mega Drive outside the US. I learned to play more by accident than anything else. It was one of the poorer-quality ports of that game: it used low-quality audio samples, didn’t speak chosen character names, and didn’t announce who won a round. The music was great, though. Anyway, throughout my life, I’ve always played video games. Of course, during my teenage years I got into audio games and such, but I still played video games because they usually had more content, or simply because there weren’t many fighting games in audio form. Besides, by that time, I was comfortable with the video games I could play. I knew the menus I needed, where my favorite characters were, and stuff like that.

Now, I’ve achieved something I never thought I’d be able to do on mobile devices: playing nearly all my favorite games, from the PlayStation Portable, to GameCube and Wii ports of PS2 games I had when I was younger, to the actual PS2 games on Android. In this blog post, I’ll talk a bit about how I did it, but mostly about my experiences playing these games on Android, and even iOS!

Emulation everywhere!

So, a few years ago, I wrote a post about emulation or something or other. I even made a video of me playing Dissidia Final Fantasy on an iPhone XR. At the time, you couldn’t do that without workarounds like AltStore. Then, around a year ago, Apple began allowing emulators onto the App Store, as long as they didn’t come with any games or BIOS files. So we got RetroArch, PPSSPP, Delta, and the like. The best is RetroArch, in my opinion: it has an accessibility service that turns on when VoiceOver is on, it supports the PSP, PS1, the Sega consoles, and more, which Delta doesn’t, and blind people who game on PC are already used to it.

Android, meanwhile, has had emulators for over a decade. And with TalkBack image descriptions, you can even have parts of games described, if the emulator is made right. I’ll get into that later, though. Now, you can play PS2, GameCube and Wii, 3DS, and even Windows games, through Wine, on Android. However, I haven’t tried Winlator or the 3DS emulators, so I can’t speak to their accessibility or security.

Emulating on iOS

So, I’m starting with this section because iOS is the platform I use most, it’s probably going to get the most attention from gamers, and it’s going to take the longest to write, maybe. So we have RetroArch in the App Store. It works a lot like the PC version, but doesn’t have systems like the PS2, GameCube, Wii, PS3, PS4, PS5, and PS6. It doesn’t even have the Nintendo Switch 2 XL XR XS Max Pro XDR! Boo! But what it does have is a ton of older systems, from the NES and Sega Genesis to the N64 and PS1. Oh, and the PSP of course, where all the cool games of that era are. Well, besides Dragon Ball Z Budokai Tenkaichi 3.

So, it’s pretty simple to use. I just drop my games into the Downloads folder, or you can make a games folder within the RetroArch app folder on your phone, and you’re good. Well, besides having to deal with the iOS Files app and its broken copy and paste commands. To get around that, you can find a file, swipe down to Copy, navigate to where you want to paste it, find an empty part of the screen, turn VoiceOver off, tap and hold, release when you feel a vibration, turn VoiceOver back on, find the Paste button, then double tap. Yes, it’s annoying, and a lot of steps. But we iOS users must cope any way we can, right?

A few days ago, though, I found something amazing. You see, newer consoles, like the PS8 and XBox 100X, must use a system called JIT (just-in-time compilation) to get performance to acceptable levels. Well, and the PS2 and GameCube emulators and virtual machines too. However, big mean red or green Apple blocks all apps except its own from using it, in the name of sEcUrItY. Thankfully, we have alternatives, and they don’t involve switching to Android. Well, I mean, if you can handle it, please do; you can emulate far more easily there. But if you’re like me and Android just isn’t ready for you yet, you can still play!

So, you’ll use two things to deal with all this: AltStore, linked above, and JitStreamer. Basically, install AltStore, grab Dolphin for iOS for the best performance, or Play! for PS2 if you really need it, and set up JitStreamer: install the WireGuard VPN app, run the pairing tool on your computer with your phone connected, upload the pairing file to the JitStreamer webpage, get your WireGuard configuration file and open it in WireGuard from the Files app, and get the Shortcut from the website. It’s a few steps, but it’s worth it!

Then, open your Shortcuts app, double tap on JitStreamer, and choose your app. It may have to download a disk image and ask for notification permission, but all of that is accessible to you at the top of the screen. Just press the OK button on each message. Once you choose your app, the shortcut will open it for you.

Now, about the PS2. I tried several games, from Soul Calibur 2 to Mortal Kombat Armageddon, and none of them ran at a speed where the sound wasn’t choppy. So I’ll keep waiting on that one. Dolphin, however, can play almost every game I’ve tried perfectly, besides Soul Calibur 2. So, put your ISO or RVZ files in the Software folder in the Dolphin app folder on your phone. Honestly, I use cloud services, like Nextcloud, to transfer things from computer to phone. If you have a Mac, perhaps AirDrop will work for you, if it’s feeling amicable that day.

Now, all you need is a controller or a keyboard. In Dolphin, you’ll need to map the controls, and that’s perfectly accessible. Then, just run your game. There will be a single button on the game screen that VoiceOver sees, which will open a menu for pausing the game, stopping the game, hiding those controls, or opening settings.

VoiceOver Recognition and Describe Screenshot

Emulation is great for blind people. It allows the blind to choose their game, use comfortable controls, and use headphones where the original console may not have had a headphone jack, for clearer audio. But it also allows for text recognition and AI descriptions.

So, let’s use that for all it’s worth! When you’re in a game, like Mortal Kombat Armageddon for the Nintendo Wii, you can enable VoiceOver’s screen recognition to get a sense of menus, and to read the character selection screen, sometimes. There are times when it doesn’t read much, but it usually works well. In games like Dissidia Final Fantasy, you can read character dialog, or what interactable game board piece you’ve found, or story mode titles. And if you leave it on an item, it updates automatically. So you can have it read character selections as you move through the selectable characters.

The Describe Screenshots shortcut is amazing. It works in any app, on any screen, and uses one of the best AI models available for describing things. So if you want to know which menu item is selected, or what a character or stage in a game looks like, you can just use that. It’s been very helpful when text recognition isn’t understandable, or doesn’t see everything. Maybe one day VoiceOver will be able to do this on its own, a good decade or so after everyone else “does it first”… Oh wait, they already have. Well, besides Linux. So maybe in 5 years we’ll have VoiceOver using large language models.

So, gaming on iOS, playing real video games that is, has gone from very… scenic, to only a little scenic, or a tad bit more. Not many people know about emulators, but I definitely want more gamers to know that they can indeed play nearly any older game they want, all on their iPhone!

Emulating on Android

Android has it easy. You just download, or sideload, an app, plop games from File Explorer onto your phone, open the app, and play. But how accessible is that? Find out in the next paragraph, because every paragraph should have from three to five sentences. Or six, or eight, or over nine thousand.

So, RetroArch works well. Its accessibility service is rather slow because it sends announcements through TalkBack, but Android is slow already, so what’s a bit more lag, right? I mean, it’s jUsT aS gOod, right? Anyway, I don’t blame RetroArch at all, since on Windows it speaks through NVDA, and that works perfectly fine.

One issue is that it doesn’t expose any onscreen elements, not even a fullscreen game area, to TalkBack. That matters because, as of TalkBack 15, we can have images described through Gemini AI. So that’s nice. Except the item you want described has to be reachable by TalkBack. So if TalkBack can’t “see” anything on the screen, it can’t describe what’s on the screen.
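The fix on the emulator side is tiny. Here’s a hedged sketch of the idea; `exposeGameSurface` and `gameSurface` are my names, not RetroArch’s actual code:

```kotlin
import android.view.View

// Hypothetical sketch: make an emulator's rendering surface visible to
// TalkBack so a user can focus it and request an image description.
fun exposeGameSurface(gameSurface: View) {
    // Surfaces with no text are often skipped by accessibility services;
    // this forces the view into the accessibility tree.
    gameSurface.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_YES
    // A content description gives TalkBack something to announce on focus.
    gameSurface.contentDescription = "Game screen"
}
```

Two lines, and TalkBack users gain a focus target for the whole game area, which is exactly what features like Gemini image descriptions need.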

There are, however, many powerful warriors throughout the realms. Like AetherSX2! That’s a PS2 emulator, basically a port of PCSX2, or something like that, which was forked because the original developer was used all up by users and abusers. So there’s a fork, and it has a fullscreen element that TalkBack can see, so you get image descriptions.

Dolphin works well too, along with Duckstation for PS1 games. So, plenty to play on Android too.

Conclusion

So, in conclusion: Apple should give us JIT, since we already have it anyway; Google should hire tons more blind people onto its accessibility teams, and open up its trusted tester program so we don’t have Google fanboys telling Google what it wants to hear while everyone else asks why Android isn’t as fast as iOS on a Samsung Galaxy S25 Ultra; and emulator developers should be commended for so much effort on platforms like iOS, which must be an epic pain to develop for and get past app review. Also, more developers should use AltStore. Epic could have for Fortnite, but they just want their own bullcrap instead of actually standing for what they sued Apple over.

Categories: accessibility Android apple blindness Google iPhone
