On Braille Access and Live Captioning

If you’ve followed my blog for any length of time, you probably know that I have significant hearing loss. I could likely even be classified as functionally deaf. This has presented challenges not only in the places you’d expect, like restaurants, on the bus, or at work, but also sometimes at home. The issue there is that I’m often trying to hear someone over a television, or the way a room is designed causes an echo that interferes with my hearing aid’s ability to clearly perceive the sound. This is stressful, to say the least.

So color me very excited to discover that in Apple’s latest update, iOS 26, they included a feature called Live Captioning. I believe there is a Live Captioning option that provides visual captions usable by those who are deaf but have sight, but the version I’m going to talk about is built specifically for those of us who are blind and also have hearing issues. That in itself is a revolution, as most accessibility regulations rarely consider people with multiple disabilities. There are rules for blind folks, rules for deaf folks, and so on, but not often an acknowledgment of the potential crossover, which I dare say is more common than we might think.

Anyhow, if you don’t know how to access Live Captioning and want to, here’s how. First, you need a Braille display of some kind; it can have either a Perkins-style keyboard or a QWERTY keyboard, like the APH Mantis I’m using. If you have a Perkins keyboard, open Braille Access by pressing dots 7 and 8 with the Space bar. On the Mantis, in most cases all you need to do to invoke this feature is press VO (Control+Option) with Y to enable Braille input, then press the letter A and the semicolon (which stand in for dots 7 and 8) along with the Space bar. The Braille Access menu that appears contains other items as well, and it operates separately from your iPhone, so you can still use VoiceOver’s speech to view other content while it is open. It lets you, for example, take notes, view the time, or use Braille to open other apps.

But I’m just pressing the up arrow (Space with dot 1 if using a Perkins keyboard) to navigate to Live Captioning and pressing Enter. Once the feature is open, you have two options, toggled easily by pressing dot 7 (or the letter A on a QWERTY keyboard) plus the Space bar: listen to audio or listen to microphone. Audio refers to the audio generated by the iPhone itself, and in this mode it can pick up spoken words even without the phone’s volume being turned up. I used this, for instance, to follow play-by-play of a Carolina Panthers game on the display while in a car with others (they beat the Dallas Cowboys on the Sunday prior to this entry’s writing, so I was thrilled, as they’ve done virtually nothing over the last five years, but I digress). I’ve also used it to make sure I can hear narrators in an audiobook; I find that after listening to a narrator’s voice for a while as I also view the transcript, I can actually understand them better.

If set to listen through the microphone, it’s as if the iPhone is recording, which means you don’t really have use of any other audio on the phone while in this mode. But the beauty of it is that it will pick up conversations and other things happening around me, such as the TV. I used this to speak with a coworker whom I’ve usually had a hard time hearing, just because of how things sound in our office and the fact that we’re positioned relatively far apart. I’ve also used it to chat with my father-in-law about work, and that went smoothly.
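For any developers reading along: Apple hasn’t said how Live Captioning is built under the hood, but the microphone mode behaves a lot like live, streaming speech recognition. Purely as an illustrative sketch (not Apple’s actual implementation), here’s roughly how someone could stream microphone audio into a rolling transcript using the public Speech framework; the LiveTranscriber class is just a name I made up for the example, and a real app would also need to request speech and microphone permissions first.

```swift
import Speech
import AVFoundation

// Illustrative sketch only: live microphone transcription with the Speech
// framework, similar in spirit to the "listen to microphone" mode above.
// A real app must call SFSpeechRecognizer.requestAuthorization and have
// microphone permission before starting.
final class LiveTranscriber {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onUpdate: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true   // stream words as they are recognized
        self.request = request

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Each partial result replaces the previous transcript, much like the
        // rolling text on a braille display.
        task = recognizer?.recognitionTask(with: request) { result, error in
            if let result = result {
                onUpdate(result.bestTranscription.formattedString)
            }
            if error != nil || (result?.isFinal ?? false) {
                self.stop()
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```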

There is, naturally, a slight delay between the words being spoken and their appearing on the display, and if you fall behind in reading the transcript (indicated by a braille box on either end of the display), you can just press a cursor routing button to jump to the end. And of course, the transcription is not always totally accurate, though it usually makes enough sense for me to get the gist of what was said. It can also be more challenging to follow if there is a lot of crosstalk, as it doesn’t indicate when a different person is speaking. But it is helping me pick up at least a degree more than I could before, and every little bit helps. This is especially important as I continue to gain responsibility and standing in my career, not to mention that it will likely make family social gatherings feel less difficult. So I applaud Apple for taking such a step. Let’s hope it’s only the beginning of what can be done to get this tech to truly help us all.
