I’ve played video games since I was around 5. I started with Mortal Kombat on the Sega Genesis, known as the Mega Drive outside the US. I learned to play more by accident than anything else. It was one of the poorer-quality ports of that game: it used low-quality audio samples, didn’t speak chosen character names, and didn’t announce who won a round. The music was great, though. Anyway, throughout my life, I’ve always played video games. Of course, during my teenage years I got into audio games and such, but I still played video games because they usually had more content, or simply because there weren’t many fighting games in audio form. Besides, by that time, I was comfortable with the video games I could play. I knew the menus I needed, where my favorite characters were, and stuff like that.
Now, I’ve achieved something I never thought I’d be able to do on mobile devices: playing nearly all my favorite games, from PlayStation Portable titles, to GameCube and Wii ports of PS2 games I had when I was younger, to the actual PS2 games themselves on Android. In this blog post, I’ll talk a bit about how I did it, but mostly about my experiences with playing these games on Android, and even iOS!
Emulation everywhere!
So, a few years ago, I wrote a post about emulation or something or other. I even made a video of me playing Dissidia Final Fantasy on an iPhone XR. At the time, you couldn’t do that without using workarounds like AltStore. Then, around a year ago, Apple began allowing emulators onto the App Store, as long as they didn’t come with any games or BIOS files. So, we got RetroArch, PPSSPP, Delta, and stuff like that. RetroArch is the best of these, in my opinion: it has an accessibility service that turns on when VoiceOver is on, it supports the PSP, PS1, the Sega consoles, and more, which Delta doesn’t, and blind people who game on PC are already used to it.
Android, meanwhile, has had emulators for over a decade. And, with TalkBack image descriptions, you can even have parts of games described if the emulator is made right. I’ll get into that later, though. Now, you can play PS2, GameCube and Wii, and 3DS games, and even Windows games, through Wine, on Android. However, I haven’t tried Winlator or the 3DS emulators, so the accessibility and security of those are unknown to me.
Emulating on iOS
So, I’m starting out with this section because iOS is the platform I use most, it’s probably going to get the most attention from gamers, and it’s going to take the longest to write, maybe. We have RetroArch in the App Store. It works a lot like the PC version, but doesn’t have systems like the PS2, GameCube, Wii, PS3, PS4, PS5, and PS6. It doesn’t even have the Nintendo Switch 2 XL XR XS Max Pro XDR! Boo! But what it does have is a ton of older systems, from the NES and Sega Genesis, to the N64 and PS1. Oh, and the PSP of course, where all the cool games of that era are. Well, besides Dragon Ball Z: Budokai Tenkaichi 3.
So, it’s pretty simple to use. I just drop my games into the Download folder, or you can make a games folder within the RetroArch app folder on your phone, and you’re good. Well, besides having to deal with the iOS Files app and its broken copy and paste commands. To get around that, you can find a file, swipe down to Copy, navigate to where you want to paste it, find an empty part of the screen, turn VoiceOver off, tap and hold, release when you feel a vibration, turn VoiceOver back on, find the Paste button, then double tap. Yes, it’s annoying, and a lot of steps. But we iOS users must cope any way we can, right?
A few days ago, though, I found something amazing. You see, emulators for newer consoles, like the PS8 and XBox 100X, must use a technique called JIT (just-in-time compilation) to boost performance to acceptable levels. The same goes for PS2 and GameCube emulators, and virtual machines too. However, big mean red or green Apple blocks all apps except its own from using it, in the name of sEcUrItY. Thankfully, we have alternatives, and they don’t involve switching to Android. Well, I mean, if you can handle it, please do; you can emulate far more easily on it. But if you’re like me and Android just isn’t ready for you yet, you can still play!
So, you’ll use two things to deal with all this: AltStore, linked above, and JitStreamer. Basically, install AltStore, grab Dolphin for iOS for the best performance, or Play! for PS2 if you really need it, and set up JitStreamer. That means installing the WireGuard VPN app, running the pairing tool on your computer with your phone connected, uploading the pairing file to the JitStreamer webpage, getting your WireGuard configuration file and opening it in WireGuard from the Files app, and getting the Shortcut from the website. It’s a few steps, but it’s worth it!
Then, open your Shortcuts app, double tap on JitStreamer, and choose your app. It may have to download a disk image, and ask for notification permission, but all of that is accessible to you, at the top of the screen. Just press the OK button on each message. Once you choose your app, the shortcut will open it for you.
Now, about the PS2. I tried several games, from Soul Calibur 2 to Mortal Kombat: Armageddon, and none of them ran at a speed where the sound wasn’t choppy. So I’ll keep waiting on that one. Dolphin, however, can play almost every game I’ve tried perfectly, besides Soul Calibur 2. So, put your ISO or RVZ files in the Software folder in the Dolphin app folder on your phone. Honestly, I use cloud services, like Nextcloud, to transfer things from computer to phone. If you have a Mac, perhaps AirDrop will work for you, if it’s feeling amiable that day.
Now, all you need is a controller or a keyboard. In Dolphin, you’ll need to map the controls, and that’s perfectly accessible. Then, just run your game. There will be a single button on the game screen that VoiceOver sees, which will open a menu for pausing the game, stopping the game, hiding those controls, or opening settings.
VoiceOver Recognition and Describe Screenshot
Emulation is great for blind people. It allows the blind to choose their game, use comfortable controls, and use headphones where the original console may not have had a headphone jack, for clearer audio. But it also allows for text recognition and AI descriptions.
So, let’s use that for all it’s worth! When you’re in a game, like Mortal Kombat: Armageddon for the Nintendo Wii, you can enable VoiceOver’s screen recognition to get a sense of menus, and to read the character selection screen, sometimes. There are times when it doesn’t pick up much, but it usually works well. In games like Dissidia Final Fantasy, you can read character dialog, or what interactable game board piece you’ve found, or story mode titles. And, if you leave it on an item, it updates automatically. So, you can have it read character selections as you move through the selectable characters.
The Describe Screenshots shortcut is amazing. It works in any app, on any screen, and uses one of the best AI models available for describing things. So, if you want to know which menu item is selected, or what a character or stage in a game looks like, you can just use that. It’s been very helpful when text recognition isn’t understandable, or doesn’t see everything. Maybe one day VoiceOver will be able to do this on its own, in a good decade or so after everyone else “does it first”… Oh wait, they already have. Well, besides Linux. So maybe in five years we’ll have VoiceOver using large language models.
So, gaming on iOS, playing real video games that is, has gone from very… scenic, to only a little scenic, or a tad bit more. Not many people know about emulators, but I definitely want more gamers to know that they can indeed play nearly any older game they want, all on their iPhone!
Emulating on Android
Android has it easy. You just download, or sideload, an app, plop games from File Explorer onto your phone, open the app, and play. But how accessible is that? Find out in the next paragraph, because every paragraph should have from three to five sentences. Or six, or eight, or over nine thousand.
So, RetroArch works well. Its accessibility service is rather slow because it sends announcements through TalkBack, but Android is slow already, so what’s a bit more lag, right? I mean, it’s jUsT aS gOod, right? Anyway, I don’t blame RetroArch at all, since on Windows, it speaks through NVDA and that works perfectly fine.
One issue is that it doesn’t expose any onscreen elements, even just a fullscreen game area, to TalkBack. That’s important because with TalkBack 15, we can now have images described through Gemini AI. So that’s nice. Except, the item you want described has to be reachable by TalkBack. So, if TalkBack can’t “see” anything on the screen, it can’t describe what’s on the screen.
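As a sketch of what exposing that single element could look like, here’s a hypothetical Kotlin snippet for an emulator’s game surface. The view name and label are my own inventions, not from any real emulator’s source; the two framework calls, though, are the standard Android way to make a view reachable by TalkBack.

```kotlin
import android.view.SurfaceView
import android.view.View

// Hypothetical sketch: how an emulator could expose its fullscreen
// game surface to TalkBack, so features like Gemini image
// descriptions have a node to target. `gameSurface` is illustrative.
fun exposeGameSurface(gameSurface: SurfaceView) {
    // Make the view a node TalkBack can focus on.
    gameSurface.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_YES
    // Give it a label so it isn't announced as "unlabeled".
    gameSurface.contentDescription = "Game screen"
    // Without a focusable, labeled node, TalkBack can't "see"
    // anything, and so has nothing to send for description.
}
```

Two lines of code, roughly, is all it would take for TalkBack to have something to describe.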
There are, however, many powerful warriors throughout the realms. Like AetherSX2! That’s a PS2 emulator, essentially a port of PCSX2, whose original developer quit after being used all up by users and abusers. So there’s a fork, and it has a fullscreen element that TalkBack can see, so you get image descriptions.
Dolphin works well too, along with Duckstation for PS1 games. So, plenty to play on Android too.
Conclusion
So, in conclusion: Apple should give us JIT, since we already have it anyway; Google should hire many more blind people onto its accessibility teams, and open up its trusted tester program so we don’t just have Google fanboys telling Google what it wants to hear while everyone else asks why Android isn’t as fast as iOS on a Samsung Galaxy S25 Ultra; and emulator developers should be commended for putting so much effort into platforms like iOS, which must be an epic pain to develop for and get past App Review. Also, more developers should use AltStore. Epic could have used it for Fortnite, but they just want their own bullcrap instead of actually standing for what they sued Apple over.
I’ve talked a lot about Android on this blog. I love the idea, and the operating
system is nice, but accessibility could be so much more. I’ve had my Galaxy S20 FE
(5G) for about a year and a half now. I’ve seen the changes from Android 11,
to 12, and finally to 13. TalkBack has improved steadily over that year and a
half, adding Braille support, which I thought wouldn’t come for another five to
ten years. Text recognition in images and unlabeled buttons was added in
TalkBack 12.1, and spell checking and icon descriptions were added in
TalkBack 13.
In this article, though, I’ll overview the changes in TalkBack 14, the ones I
have access to, that is; I’ll get to that later. I’ll also talk about the
problems facing Android that aren’t really about TalkBack, but are more about
the accessibility framework, and what apps can and can’t do. So this will be a
sort of continuation of my other Android articles, more than just a “what’s new
in TalkBack” style article.
TalkBack 14, Lots of Braille
TalkBack 14 is a good iteration of where TalkBack 13 started. TalkBack now has
many more commands, both for Braille displays and for its Braille keyboard. One
can now move around an edit field by words and lines, not just characters, using
the onscreen Braille keyboard. One can also select text, and copy and paste it
using the same keyboard. You don’t have to dismiss the keyboard just to do all
that. To be fair to iOS, you can do that with Braille Screen Input, but the
commands are not documented in either Apple’s documentation, or in the VoiceOver
settings. In TalkBack settings, those commands are clearly documented and
explained.
TalkBack 14 now supports the NLS EReader, which is being freely distributed to
NLS patrons. By the end of the year, all 50 states will have the EReader. You
do have to connect the display to your phone via USB C, and the cable I had on
hand shorted out, so I have to find another one. But I was able to use it with a
USB hub, which further made the setup less mobile, but it did work. The
commands, though, were rather more complicated than I expected. I had to press
Enter with dots 4-5 to move to the next object. Space with Dot 4 was used to
move to the next line, and Space with Dot 1 was used to move to the previous
line. So I quickly moved back to using the EReader with the iPhone. I’ll
practice with it more, but for now it just doesn’t feel as practical as using
the EReader, over Bluetooth, on the iPhone, with its simpler commands.
A window into Images
TalkBack 14 has a new screen of choices, where you can enable options regarding
image recognition. You have the usual text recognition, and icon recognition,
but the screen refers also to “image recognition,” similar to what VoiceOver can
do. This is something I’ve wanted for a long time. Some people have a third
option, “image descriptions,” but I don’t have that option. Google often rolls
out features to a small subset of users, and then rolls it out to everyone else
after weeks or months of testing. We’ll have to see how that works out.
Of note, though, is that whenever one gets an iOS update, one gets all the new
features right away. There is no rollout of features for VoiceOver; it’s just
there. TalkBack 14, as a public release, should have all the features available
to everyone at launch, in my opinion. They could always label image
descriptions as “beta.”
The Accessibility Framework
As I’ve said before, the operating system is the root of all accessibility. If
the accessibility framework is limited, then apps are limited in what they can
do as far as accessibility is concerned. This is why I’ve been so critical of
Google, because Android’s accessibility framework, and what apps can communicate
to TalkBack, is limited. I’ll give a few examples.
Kindle
I love the books I can get on Kindle. I love that I can read them on just about
all of my devices. But not all Kindle apps are created equal. The app on the
iPhone is great. Using VoiceOver, I just swipe down with two fingers and the
book is read to me. I can move my finger up and down the screen to read by line.
I can use a Braille display and just scroll through the book, no turning pages
required since it happens automatically. On Android, however, the Kindle app is
more limited.
When you open a book in Kindle for Android, you find a page, with a “start
continuous reading” button. All this button does is pipe the text of the book
out to the Android speech engine. This distinction is important. On iOS, since
VoiceOver is controlling things, you can quickly speed up, slow down, pause and
resume, or change the voice quickly. On iOS, you can read by word or letter, and
most importantly, read easily with a Braille display.
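As a rough, speculative sketch of what “piping text to the speech engine” means in code — I don’t have Kindle’s source, so the function and names here are illustrative — it amounts to handing a page of text to Android’s TextToSpeech API directly:

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Speculative sketch of a "start continuous reading" button handler.
// The book text goes straight to the system TTS engine, bypassing
// the screen reader entirely.
fun readPageAloud(context: Context, pageText: String) {
    lateinit var tts: TextToSpeech
    tts = TextToSpeech(context) { status ->
        if (status == TextToSpeech.SUCCESS) {
            // TalkBack never sees this text, so the user can't pause,
            // change speed or voice, read by word, or route the text
            // to a Braille display.
            tts.speak(pageText, TextToSpeech.QUEUE_FLUSH, null, "page")
        }
    }
}
```

That design choice is exactly why the experience described below falls apart: speech and the screen reader are two separate pipelines, and only one of them is under the user’s control.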
On Android, you can move your finger down the page to hear lines of text, which
are sent to TalkBack as announcements. But if you try to have TalkBack read the
book, it won’t get past the current page. The same is even more true with
Braille; you have to turn pages manually, using the touch screen, because it’s
not actually TalkBack that’s turning the page. So you have to keep touching the
phone’s touch screen in order to continue interacting with the app. Braille
displays have keys for a reason. You shouldn’t have to use the touch screen to
do anything while using a Braille display with your phone. Most Braille display
users keep their phone in their pocket while they use it from their displays.
With a lot of other book-reading apps, you can just lock the screen, and just
listen to the book. Many blind Android users love that, and find it superior to
reading with a screen reader. However, the Kindle app doesn’t even satisfy that.
Whenever the screen times out and locks, after that page is finished, the page
is turned, but the speech stops. You have to unlock the screen, and then press
“start continuous reading” again.
Now, if TalkBack could read the book, and turn the page, the experience would be
much better. But Google’s accessibility framework has moved at a glacial pace,
throughout the ten or fifteen years of Android, and iOS, development. While
Apple opened up APIs to developers, so that VoiceOver could turn pages while
reading, Google has not even added that feature to their own reading app.
Instead, Play Books uses a web view, and just detects when the user has gone
beyond the last element on the page, and then just turns the page. At least,
that’s what I think is happening. I obviously don’t have access to the source
code of the Play Books app.
Mortal Kombat
Games are becoming more and more important in the mobile ecosystem. Mobile games
are, in some cases, more popular than console games. But mobile games are
sometimes very hard to make accessible. Take the Mortal Kombat game. You have an
interface where you choose a game mode, make a team of fighters, upgrade cards,
and change settings. Then, you have the fight mode, where you tap to attack,
swipe for a special attack, and hold two fingers on the screen to block. On iOS,
the developers have made the buttons visible to VoiceOver, and added labels to
them. They’ve shown the text elements, where you “tap to continue”, to
VoiceOver, and allowed the double tap to advance to the next screen. That part,
I believe, could be done on Android as well.
The real fun is in the battles, though. Once a fight starts, on iOS, VoiceOver
is pushed out of the way, so to speak, by a direct touch area. This allows taps
and swipes to be sent directly to the app, so that I can play the game. While
I’m fighting, though, the game sends text prompts to VoiceOver, like “swipe up,”
or “Tap when line is in the middle.” I’m not sure exactly what the last one
means, but “swipe up” is simple enough. This allows me to play, and win,
battles.
Unfortunately for Android users, though, this “direct touch area” is not
possible. Google has not added this feature for app developers to take advantage
of. They theoretically could, but they’d then have to make an accessibility
service for the app, and then make sure that the service is running when the app
runs. Users are not going to turn on an accessibility service for a game, and
developers are not going to spend time dealing with all that for the few blind
people, relatively speaking, on Android.
Catching the Apple
Google, for the last few years, has been trying hard to catch up to Apple. They
have a long way to go. Apple, however, hasn’t stayed still. They have a decade’s
worth of built-up experience, code, frameworks, and blind people who, each time
they try Android, find that it falls short and come back to iOS. I’m not saying
Apple is perfect. And each time a wave of blind people try Android, a few find
that it works for what they need a phone for.
As more and more blind people lean into using a phone as their primary computing
device, or even their secondary computing device, accessibility is going to be
more important than ever. We can’t afford half-baked solutions. We can’t afford
stopgap measures. Companies who build their services on top of these platforms
will do what they can to make their apps accessible, but they can only do so
much. In order to make better apps, developers need rich, robust APIs and
frameworks. And right now, that’s Apple. And I’ve gotten tired of holding my
breath for Google. I’m just going to let out that breath and move on. I’ll
probably keep my Android phone around, but I’m not going to use it as my primary
device until Google gets their act together.
Some Android users will say that I’m being too harsh, that I’m not giving Google
enough time, or that I’m being whiny, or radical, or militant. But it took Google ten or so years to add
commands that used more than one finger. It took them ten years to add Braille
support to their screen reader. It took them ten years to add spell checking.
I’m not going to wait another ten years for them to catch up to where Apple was
a good three years ago.
Over the past few years, I’ve seen something that kind of troubles me. While iPhone users write books about using the iPhone on their iPhones, clear out their email on their Apple Watch and manage the rest on their iPhones, and use their iPhones as their primary computing devices, Android users feel like one cannot be productive on any mobile system. So, here’s the thing. When you are around sighted people, even at a job sometimes, what are they using? Their computer? No. They’re on their phone. Maybe it’s an iPhone, or perhaps it’s an Android; it doesn’t matter. Either way, people are doing all kinds of things on their phones. When you go to a center for blind people, what do you see? People on their computers? Sometimes, but the younger people are on their iPhones.
I’ll talk about the differences between iPhone and Android later. But this cannot be overstated: the phone is now, for the majority of sighted, and even blind, people, their main computing device. Even a few older blind people I’ve talked to would rather not use a computer now. They’re all over their iPhones. So, what does this kind of productivity look like?
Quick flicks are best
Fast access to information is important. But being able to act on that information is even more significant. If I can’t quickly archive an email, I may not mess with a mail app that much. I want to get through my inbox quickly, able to read through threads of info. The iPhone does this well, allowing me to flick up or down, double tap, and the email is out of sight. Within a conversation, I can move to the previous or next message, and archive, reply to, or flag an individual message in that conversation. On Android, in Gmail, I can act upon a conversation, but inside a conversation, there are no quick actions. One must navigate through the message, along with any quoted or signature text, find a button, and double tap. Yes, there are other mail clients. Aqua Mail comes close to being like the iPhone Mail app. But it has no actions, and if one can get an honestly great mail client on an iPhone without needing to buy another app, why should someone consider Aqua Mail and Android?
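For what it’s worth, Android does have an API that would let a mail client expose quick actions to TalkBack; it just isn’t widely used. Here’s a hedged Kotlin sketch using AndroidX, with a made-up view name and callback, just to show the shape of it:

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Hypothetical sketch: exposing "Archive" as a custom accessibility
// action on a message row. `messageRow` and `archiveMessage` are
// illustrative names, not from any real mail client.
fun addArchiveAction(messageRow: View, archiveMessage: () -> Unit) {
    ViewCompat.addAccessibilityAction(messageRow, "Archive") { _, _ ->
        // TalkBack lists this in its actions menu; the user picks it
        // and double taps, no hunting for an on-screen button.
        archiveMessage()
        true // report that the action was handled
    }
}
```

So the quick-actions gap in apps like Gmail is less a framework limitation than a matter of developers actually wiring this up.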
A Book on a phone on a phone
I can’t get over how good Ulysses for iOS and macOS is. While I’m using Ulysses for Mac right now, I still marvel at what a person was able to make with just an iPhone, an app, and a Bluetooth keyboard. You may then say, “Well, if you’ve got a keyboard, you might as well have a laptop.” To which I would show a marvelous invention, called the pocket. A phone in your pocket, headphones in your ears, a keyboard in your lap (particularly one of those slim Logitech keyboards), and you’ve got a nice writing environment that is much less bulky than a laptop, whose trackpad and screen add weight and thickness, along with the CPU and hard drive.
Next is the app. I’ve tried a lot of writing apps on Android. From iA Writer to a lot of Markdown note apps, I looked for a replacement for Ulysses that would give me the power that allowed a blind person to write an entire, large book on his iPhone. And I couldn’t find it. From unlabeled buttons, to no way to navigate by heading or link inside the document, to no way to link chapters together and export them as a book, none of the apps were viable. This is not to say that no such app will exist in the future, or that Android will never have a good enough accessibility framework to allow the creation of such apps. But right now, the iPhone, the most locked-down operating system in the mobile space, has allowed a level of creativity from a writer which was previously only seen on Windows. Furthermore, it allows a far more accessible writing environment, enabled by Markdown.
Android, meanwhile, is still trying to get dictation without TalkBack speaking over the person dictating, or Google Assistant without TalkBack loudly talking over it, phone calls where you don’t hear “keypad button” at the start of each call, image descriptions, a pronunciation dictionary, and so on. This isn’t to imply that the iPhone and VoiceOver are perfect. They are not, and amass bug after bug with every release. But, as of now, the iPhone is still the most productive platform. Android is coming around, quickly speeding up over the last year or so. I really hope it gets to the point where we can not only write books on our phones, but also create apps and music, and edit audio and video, efficiently and effectively. At the very least, I’d love to be able to do any office work a job may require, with our phones hooked up to USB-C docking stations, keyboards, and external displays.
More than likely, though, VoiceOver on the iPhone will continue to decline. TalkBack will reach where VoiceOver is right now, and stop because they ran out of ideas. The blind community will continue having to come up with workarounds, like not using the Notification Center when a Braille display is connected, or using Speak Screen on older iPhones from 2020 because VoiceOver is so badly optimized that it runs out of memory while reading an online article. Meanwhile, TalkBack will gain image descriptions, and it’ll be more than “gift card,” “blue sky,” on an app where you clock in and out of work, which is what VoiceOver does. TalkBack already speaks the text of the button, rather than describing the button. Yes, the button is unlabeled.
But the thing that really holds the iPhone up is the apps. Lire for RSS, Overcast for podcasts, Ulysses for writing, Seeing AI for recognition, and so on. And there’s an actual website with lists of apps for iOS. Android has podcast apps, RSS apps, writing apps, and recognition apps. And some, like Podcast Addict and Feeder, are great apps. But they don’t approach the accessibility of their iOS counterparts. Podcast Addict, for example, has the following layout when viewing episodes of a podcast: “Episode name, episode name button, contextual menu button.” Overcast, on the other hand, simply has a list of episodes. Android pros get around this by saying one should just feel one’s way down the screen, and scroll forward. What if one is using a Braille display or Bluetooth keyboard? What if one is both blind and lacks dexterity in the hands, and so needs to use switch access? This is the kind of thing that iOS already has: a good, clean user interface. Sure, right now, it’s fallen into disrepair. Sure, you’ve got bugs crawling out from the walls. Sure, it feels sluggish on iPhones from just two years ago. But it’s still the best we have.
And this is where a sighted person cannot understand. To them, an iPhone XR is as good as the latest Galaxy phone, or even the latest iPhone, not counting the camera. Developers plan for sighted use. They make sure things look good and flow smoothly, from the CPU on up to the touch screen. And yet, things work so differently for blind people. Adding a podcast episode to the queue may take a simple swipe for a sighted user, but several swipes and taps for a blind Android user. And that’s why productivity, a good accessibility framework, app development tools that automatically make a view as accessible as possible, and a good, high-quality screen reader are so important. It takes all of that for a blind person to be productive, and that’s why most blind people in developed countries choose the iPhone, every time.