So this is in response to this blog post regarding Mr. Beast’s blindness video, which shows the perspective of a person who still has some remaining vision. I, however, have none. I am completely blind. I wanted to write a response that shows my perspective on the video, and on the possibility of regaining sight. I speak for myself, not for anyone else in the blind community or culture. I’ll also talk about the need for a Blind culture, and the cheapening of such culture by these kinds of videos.
The video, titled “1,000 Blind People See for the First Time,” was hard for me to watch, in several ways. First, the title is a lie; I’ll get to that later. Second, a lot of it is visual in nature, with very little description. Third, it has no audio description. Audio description is an extra audio track that plays alongside the original, in which a person describes the events happening in the video: not just the text or the main idea, as in this video, but the people, places, and actions on screen. Sometimes YouTube videos have “separate but equal” versions with descriptions, but I searched and could not find one, using my admittedly slower Mac with Safari and VoiceOver. Still, if this video was about blind people, it should be suitable for blind people as well. It reminds me of the accessibility overlay companies, which will gladly post images on social media without descriptions.
The Big Lies
Let’s pick apart the title of the video. I’m not going to concern myself with 1,000; picking an arbitrary number like that seems cruel to me, but let’s move on. “Blind people.” Were these people blind? No, they weren’t. In the culture I live in, they would be called “low vision” or “visually impaired.” Mr. Beast cured people with cataracts. In the worst cases, maybe that could mean blindness. But the video included people who had been blind for four months. Imagine having a disability for four months. Comparing that to my life of absolutely no vision is very harmful. It cheapens the lives of those who are actually blind, and makes me feel as if I’m not even worthy to be called blind; as if my experiences are worthless, my work is worthless, and my life is meaningless. Let’s move on to the next part. “For the First Time.” Again, a lot of these people could see before. Maybe they couldn’t see perfectly, but their eyes could perceive enough to live mostly normal lives. They could see people’s faces, with glasses. In fact, one of the people in the video said something like “Well, I don’t need these anymore.” I don’t know for sure, because again, the video wasn’t described, but I’m pretty sure she was referring to glasses. You know what? If I could see well, even enough to not need a screen reader, I’d happily, happily take glasses over what I have now. If I needed glasses and a screen magnifier to see a computer or television screen, I’d take that in a heartbeat.
The cheapening of Blind Culture
The next time you meet a Deaf person, I want you to ask them how they would respond if a person with some hearing loss approached them and claimed to also be Deaf. Now, I don’t know any Deaf people personally, and am not Deaf myself, but I’m pretty sure it wouldn’t go over well, to say the least. The reason I think this is that I never hear hard of hearing people call themselves Deaf. Why? Because their experience is not the same as a Deaf person’s. The same applies to Blind people, if indeed we want to be a stronger culture. We must be allowed to have our own words, our own experiences, our own culture. To do otherwise is to weaken and cheapen the bond shared by all totally blind people.
If, instead, we allow videos like this to claim our words, our terms, and our experiences, we’ll need to retreat to where Autistic people are now, having to call themselves “Actually Autistic.” Why? Because the broader culture claimed their word and their way of finding and talking about one another, and explaining themselves. So now they have to use another hashtag to show who they actually are. And honestly, we have so little power as it is. If we allow the term “blind” to be used for those who have usable vision, the general population will then think that there are no people that can’t see at all, and when we tell them that we cannot see, they’ll think we’re lying. This already happens sometimes to me. And it’s yet another slap in the face.
And here we get to the biggest problem with this video: its effect on the general population. If people don’t look any further, which many conservatives have proven they will not, they will think that blindness is curable in ten minutes. It’s not. There are so many causes of blindness, and so many have no cure. A lot of blind or low vision people do not want to be cured, content with the life they live or the vision they do have. And now, sighted people have a video to point to. “Dude, it’s just ten minutes,” they’ll say. “Are you so anti-vax that you won’t even stop being a burden on society?” Parents will guilt-trip their kids. Husbands will guilt-trip their wives. And for what? A cure that will more than likely not apply to them.
And what of people, like me, who would love even to see as low vision people do? What of people who would give much for a cure, who are shown this video and asked, “Hey dude, check your mail! Maybe you were one of the 1,000! Maybe you won!” Just another slap in the face. An undescribed video, an arbitrary number, a single cure, for those who are not actually blind.
Another aspect of this is that we do want to experience the world. Why else would we want pictures described, or to be able to watch videos or television, or play video games? Yes, we have our own culture. We have our audiobooks, text-to-speech voices, audio games, and screen readers. But we want to know the sighted culture too. Sure, some may not want to see, enjoying who they are. Others beg their god or science to be able to see. But whatever way we experience the world, whether through sight, visual interpretation, or reading books about it, we love experiencing it. Even those who do not want to see the world still live in it. But this video, with its outright lies and false hope and capitalistic choosing of just 1,000 people, doesn’t help anyone. From the low vision people who were “left behind,” to the blind people for whom there is no cure, to the general public who will now have another excuse to shun us, it does more harm, I think, than good.
Over the past few years, I’ve seen something that kind of troubles me. While iPhone users write books about using the iPhone on their iPhones, clear out their email on their Apple Watch and manage the rest on their iPhones, and use their iPhones as their primary computing devices, Android users seem to feel that one cannot be productive on any mobile system. So, here’s the thing. When you are around sighted people, even at a job sometimes, what are they using? Their computer? No. They’re on their phones. Maybe it’s an iPhone, or perhaps it’s an Android; it doesn’t matter. Either way, people are doing all kinds of things on their phones. When you go to a center for blind people, what do you see? People on their computers? Sometimes, but the younger people are on their iPhones.
I’ll talk about the differences between iPhone and Android later. But this cannot be overstated: the phone is now, for the majority of sighted, and even blind, people, their main computing device. Even a few older blind people I’ve talked to would rather not use a computer now; they’re all over their iPhones. So, what does this kind of productivity look like?
Quick flicks are best
Fast access to information is important, but being able to act on that information is even more significant. If I can’t quickly archive an email, I may not bother much with a mail app. I want to get through my inbox quickly, reading through threads of information. The iPhone does this well: I flick up or down, double tap, and the email is out of sight. Within a conversation, I can move to the previous or next message, and archive, reply to, or flag an individual message in that conversation. On Android, in Gmail, I can act upon a conversation, but inside a conversation, there are no quick actions. One must navigate through the message, along with any quoted or signature text, find a button, and double tap. Yes, there are other mail clients. Aqua Mail comes close to being like the iPhone’s Mail app, but it has no actions, and if one can get an honestly great mail client out of an iPhone without needing to buy another app, why should someone consider Aqua Mail and Android?
A Book on a phone on a phone
I can’t get over how good Ulysses for iOS and macOS is. While I’m using Ulysses for Mac right now, I still marvel at what a person was able to make with just an iPhone, an app, and a Bluetooth keyboard. You may then say, “Well, if you’ve got a keyboard, you might as well have a laptop.” To which I would show a marvelous invention, called the pocket. A phone in your pocket, headphones in your ears, a keyboard in your lap (particularly one of those slim Logitech keyboards), and you’ve got a nice writing environment that is much less bulky than a laptop, whose trackpad and screen add weight and thickness, along with the CPU and hard drive.
Next is the app. I’ve tried a lot of writing apps on Android, from iA Writer to a lot of Markdown note apps, looking for a replacement for Ulysses that would give me the power that allowed a blind person to write an entire, large book on his iPhone. I couldn’t find it. From unlabeled buttons, to no way to navigate by heading or link inside the document, to no way to link chapters together and export them as a book, none of the apps were viable. This is not to say that no such app will exist in the future, or that Android won’t eventually have a good enough accessibility framework to allow the creation of such apps. But right now, the iPhone, the most locked down operating system in the mobile space, has allowed a level of creativity from a writer that was previously only seen on Windows. Furthermore, it offers a far more accessible writing environment, enabled by Markdown.
Android, meanwhile, is still trying to get dictation that works without TalkBack speaking over the person dictating, Google Assistant without TalkBack loudly talking over it, phone calls where you don’t hear “keypad button” at the start of each call, image descriptions, a pronunciation dictionary, and so on. This isn’t to say that the iPhone and VoiceOver are perfect. They are not, and they amass bug after bug with every release. But, as of now, the iPhone is still the most productive platform. Android is coming around, speeding up noticeably over the last year or so. I really hope it gets to the point where we can not only write books on our phones, but also create apps and music, and edit audio and video efficiently and effectively. At the very least, I’d love to be able to do any office work a job may require, with our phones hooked up to USB-C docking stations, keyboards, and external displays.
More than likely, though, VoiceOver on the iPhone will continue to decline. TalkBack will reach where VoiceOver is right now, and stop because they ran out of ideas. The blind community will continue having to come up with workarounds, like not using the Notification Center when a Braille display is connected, or using Speak Screen on iPhones from 2020 because VoiceOver is so badly optimized that it runs out of memory while reading an online article. Meanwhile, TalkBack will gain image descriptions, and they’ll be more than “gift card” and “blue sky” on an app where you clock in and out of work, which is what VoiceOver gives me. TalkBack already speaks the text of the button, rather than describing the button. Yes, the button is unlabeled.
But the thing that really holds the iPhone up is the apps. Lire for RSS, Overcast for podcasts, Ulysses for writing, Seeing AI for recognition, and so on. And there’s an actual website with lists of apps for iOS. Android has podcast apps, RSS apps, writing apps, and recognition apps. And some, like Podcast Addict and Feeder, are great apps. But they don’t approach the accessibility of their iOS counterparts. Podcast Addict, for example, has the following layout when viewing episodes of a podcast: “Episode name, episode name button, contextual menu button.” Overcast, on the other hand, simply has a list of episodes. Android pros get around this by saying one should just feel one’s way down the screen and scroll forward. What if one is using a Braille display or Bluetooth keyboard? What if one is both blind and lacks dexterity in the hands, and so needs to use switch access? This is the kind of thing that iOS already has: a good, clean user interface. Sure, right now, it’s fallen into disrepair. Sure, you’ve got bugs crawling out from the walls. Sure, it feels sluggish on iPhones from just two years ago. But it’s still the best we have.
And this is where a sighted person cannot understand. To them, an iPhone XR is as good as the latest Galaxy phone, or even the latest iPhone, the camera aside. Developers plan for sighted use. They make sure things look good and flow smoothly, from the CPU on up to the touch screen. And yet, things work so differently for blind people. Adding a podcast episode to the queue may take a sighted Android user a simple swipe, but it takes a blind Android user several swipes and taps. And that’s why productivity, a good accessibility framework, app development tools that automatically make a view as accessible as possible, and a good, high-quality screen reader are so important. It takes all of that for a blind person to be productive, and that’s why most blind people in developed countries choose the iPhone, every time.
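To make that concrete: on Android, a view is only as accessible as the information developers attach to it. Here is a minimal Kotlin sketch of the difference between a control TalkBack announces as just “button” and one it can actually name. The function and the label are my own invention for illustration; `contentDescription` itself is the real Android property.

```kotlin
import android.widget.ImageButton

// Hypothetical icon-only control in a podcast app. Without a label, a screen
// reader has nothing to announce but the control's type: just "button".
fun labelPlayButton(playButton: ImageButton) {
    // One line gives TalkBack something meaningful to speak.
    playButton.contentDescription = "Play episode"
}
```

It really is that small a step, which makes unlabeled buttons in shipping apps all the more frustrating.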
For years now, Google has been seen, for good reason I’d say, as moving very slowly on accessibility. TalkBack would get updates in fits and starts, but otherwise didn’t seem to have people who could devote much time to it. Starting a few years ago with multi-finger gestures, TalkBack development began picking up steam, and to my surprise, delight, and relief, it has not slowed down. The team seems to spend as much time resolving issues as creating new features and experiences. This was highlighted in the new TalkBack update that began rolling out on January 9.
On that day, there was a TalkBack update from Google (not Samsung) which bumped the version to TalkBack 13.1. New in this version is the ability to use your HID Braille display over USB; support for Bluetooth will come when Android has Bluetooth drivers for these displays. That alone is worth an update. But there’s more! New in TalkBack is the ability to spell check messages, notes, and documents, a feature that took iOS two major updates to complete. But there’s more! Now, we can use actions the same way iOS does. That alone would have been worth several updates. We also have many more languages available for Braille users, and we can now switch the direction of the panning buttons. On the Focus Braille displays, the right whiz-wheel-style buttons now pan, giving two ways to pan text. And we can now move from one container to another, just like in iOS.
Now, I know that was a lot of info, in just a minor version bump. So let’s unpack things a bit. I’ll describe the new features, and why they impress me a lot more than Apple’s latest offerings.
HID Braille over USB
When TalkBack’s Braille support was shown off last year, there was a lot of talk about the displays that were left out. Displays from Humanware, which use the Braille HID standard, were not on the list, mainly because Android has no Bluetooth drivers for them, meaning TalkBack can’t do anything with them over Bluetooth. With this update, however, people who have these displays, like the NLS eReader from Humanware, can plug them into their phone with a USB-C cable and use them with TalkBack. This is made simpler because these displays already work over USB-C, and Android phones already use USB-C, so you don’t need an adaptor to plug your display into your phone.
This demonstrates two things to me. First, the TalkBack team is willing to do as much as it can to support these new displays and the new standard. I’m sure they’re doing all they can to work with the Bluetooth team to get a driver into Android 14 or 15. Second, even if the wider Android team doesn’t have something ready, the accessibility team will do whatever it can to get something working. Because Braille is important, they released USB support for these displays now, rather than waiting for Bluetooth support later. And when Bluetooth support arrives, adding it for these displays should be easier and quicker.
Now, TalkBack’s Braille support isn’t perfect, as we’ll see soon, but when you’re walking down a path, steps are what matters. And walking forward slowly is so much better than running and falling several times and getting bugs and dirt all over you.
Spellchecking is finally here!
One day, I want to be able to use my phone as my only computing device. I would like to use it for playing games, writing blog posts like this one, web browsing, email, note-taking, everything at work, coding, learning to code, and Linux stuff. While iOS’s VoiceOver has better app support, from the likes of Ulysses and such, Android is building what could ultimately give many developers a reason to support accessibility. Another brick was just put into place: the ability to spell check.
This uses two new options in TalkBack’s “reading controls”: a new control for finding spelling errors, and the new Actions control for correcting them. It works best if you start from the top of a file or text field. You switch the reading control to the “spell check” option, swipe up or down to find a misspelled word, then change the control to “actions” and choose a correction. iOS users may say, “Well, yeah, I can do that too.” But that’s the point. We can now even more clearly make the choice between iPhone and Android based not on “Can I get stuff done?” but on “How much do I want to do with my phone?” and “How much control do I want over the experience?” This is all about leveling the field between the two systems, and letting blind people decide what they like, more than what they need.
Actions become instant
From what I have seen, the iPhone has always had actions. VoiceOver users could always delete an email, dismiss a notification, or reschedule a reminder with the Actions rotor: a user swipes up or down with one finger to select an option, then double taps to activate it. This lets blind people perform swipe actions, like deleting a message, liking a post, boosting a toot, or going to a video’s channel. Android had them too; they were just buried in an Actions menu. Unless you assigned a command to it, you had to open the TalkBack menu, double tap on Actions, find the action you wanted, and double tap again. Here are the steps for a new Android user, who has not customized the commands, to dismiss a notification through the Actions menu:
1. Find the notification to be dismissed.
2. Tap once with three fingers to open the TalkBack menu.
3. Double tap with one finger to open the Actions menu.
4. Swipe right with one finger to the “Dismiss” option.
5. Double tap with one finger.
Now, with the new Actions reading control, here’s how the same user will dismiss a notification:
1. Find the notification.
2. Swipe up with one finger to the “Dismiss” option.
3. Double tap with one finger.
This action is one that users perform hundreds of times per day, and this essential task has been cut from five steps to three. And, with TalkBack’s excellent focus management, once you dismiss a notification, TalkBack immediately begins speaking the next one. So to dismiss that one too, you just swipe up with one finger and double tap again. It’s effortless, quick, and delightfully responsive.
On Android, since actions have been rather hidden from users, developers haven’t always put them into their apps. Of course, not every app needs them, but they would help apps like YouTube, YouTube Music, Facebook, Goodreads, PocketCasts, Google Messages, WhatsApp, Walmart, and Podcast Addict, to name a few. It will take some time for word of this new ability to spread around the Android developer space. For Android developers who may be reading this, please refer to this section on adding accessibility actions. That entire page is a great resource for creating accessible apps: it describes things clearly and gives code examples for each topic.
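As a sketch of what exposing such an action looks like in code: roughly, a developer attaches a named action to the view, and TalkBack surfaces it in the Actions reading control. `ViewCompat.addAccessibilityAction` is the real AndroidX API; the function, the row, and the callback names below are my own invention for illustration.

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Hypothetical episode row in a podcast app. Sighted users might swipe the
// row to queue an episode; this exposes the same operation as a named
// accessibility action, so TalkBack users can pick it from the Actions
// reading control instead of hunting for a hidden button.
fun exposeQueueAction(episodeRow: View, addToQueue: () -> Unit) {
    ViewCompat.addAccessibilityAction(episodeRow, "Add to queue") { _, _ ->
        addToQueue() // run the same code the swipe gesture would
        true         // report that the action was handled
    }
}
```

A few lines like this, per gesture-only feature, is what turns the five-step menu dance described above into a flick and a double tap.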
Interestingly, the other method of accessing actions is still around. If you have an app, like Tusky, which has many actions, and you want to access one at the end of the list, you can still open the Actions menu, find the action you want, and double tap. In Android, we have options.
New Languages and Braille features
One piece of critical feedback from users of TalkBack’s Braille support was that only about four languages were supported. Now, besides a few exceptions like Japanese and Esperanto, many languages are supported. One can add or remove Braille languages, like Braille tables in iOS, except that everyone knows what a language, in this instance, means, while very few know what a Braille table is. That gets into the sometimes very technical language that blindness companies use in their products, from “radio button” to “verbosity,” which I should write about in the future. For now, though, Google named its stuff right, in my opinion.
In the advanced screen of Braille settings, you can now reverse the direction of panning buttons. I never liked this, but if someone else does, it’s there. You can also have Braille shown on the screen, for sighted users or developers.
For now, though, if you choose American English Braille instead of Unified English Braille, you can only use Grade 1 Braille, not Grade 2. However, computer Braille is now an option, so you can finally read NLS BARD Braille books, or code in Braille, on your phone. This brings textual reading a step closer on Android!
Faster and cleaner
Speed matters. Bug fixes matter. In TalkBack 13.1, Google gave us both. TalkBack, especially while writing with the Braille onscreen keyboard, is somehow even snappier than before. The bug where, if you paused speech, TalkBack from then on couldn’t read past the first line of a multi-line item, is gone. And TalkBack now reads the time, every time, as the first thing it says when you wake up your phone.
Meanwhile, if I have VoiceOver start reading down a page from the current position, it stops speaking for no reason. iOS feels old and sluggish, and I don’t feel like I can trust it to keep up with me. I just want Apple to focus on fixing its bugs rather than working on new features. They spent resources on technology like that DotPad they were so excited about, which no blind person actually has, while their tried and true Braille display support suffers. Yeah, I’m still a bit mad about that.
The key takeaway from this section is that perhaps real innovation is being able to push out features without breaking as much as you add. For blind people, a screen reader isn’t just a cool feature, or a way to look kind in the media, or a way to help out a small business with cool new tech. It’s a tool that had better be ready to do its job. Blind people rely on this technology. It’s not a fun side project, and it’s not a thought experiment. It’s very practical work that requires care for, often, people who are not like you.
Luckily, Google has blind people working for it. And if the past year is any indication, they’re finally getting the resources, or attention, they need to really address customer feedback and provide blind Android users with what will make Android a great system to use.