Live Caption Addendum: A Hearing Aid Horror Story

I guess I should begin by noting that this happened to me because of my own hard-headedness. I should have gone in to have the aids maintained long before it got to that point, but it’s always hard to find a stopping point in busy life and work. I guess next time I’ll just have to schedule the appointment six months out and aim to make it there for the routine retubing. Anyway, here’s what happened.

Have you ever had one thing happen, then another thing immediately follows and something else follows on that until it feels like catastrophic failure… then you wake up? And breathe a huge sigh of relief? “Man thank goodness, I had no idea how I was getting out of that mess!”

Well this incident felt like that, except unfortunately it was not “but a dream.”

Monday night. My wife and I were wrapping up chatter about our day and about to pack it in for the night, me lying on my side and exerting pressure on the left-side hearing aid. Suddenly I realized the sound had become muffled. “Oh no, not again,” I thought. “I know this stupid thing hasn’t become blocked by wax again.” Only when I reached my hand up to investigate, I discovered that the tubing had actually been severed from the earhook. Worse, it had split slightly at the top, meaning there was no way I could reconnect the pieces. My stress meter immediately shot up, and my brain whirred as it tried to come up with at least a minimal fix that would let me get to the clinic the following day. Figuring that my left ear was the more important one to hear from, I decided to connect my left-side aid to the right-side tubing so that I could at least nominally communicate. And then I went to bed, restless.

Tuesday morning: I awoke, called out of work, and prepared to settle into my recliner to await the clinic’s opening at 8:00 AM. As I turned on the SiriusXM Real Jazz station I enjoy waking up to and opened the book I was reading on my Mantis Braille display, I happened to reach up and touch the left-side hearing aid for some reason. And then that tubing, which had previously been attached to the right-side aid, also broke. It too would no longer stay attached to the earhook. This meant that one of my greatest fears had come true: I was completely unable to hear. By this point my wife had already departed for work, which was fortunately only a few minutes away. I sent her a frightened text, and she said that as soon as she could get the day started and notify those who needed to know, she would return home to assist me.

8 AM came, and no one answered the phone right on the dot. Then 8:10, then 8:15, at which time I left a message. Finally, on my 8:25 attempt, someone picked up. I had to hook my USB headphones into the iPhone and turn them up to full volume, and even then I could just barely make out what was being said. The receptionist said a graduate assistant could see me at 10:30 AM, and I instantly felt much better.

So I put on my clothes, sat downstairs on the couch, and waited for my wife to show up shortly after 9:30. I had already enabled the iPhone’s Live Caption feature that I wrote about in the previous post, so I would know when she had reentered. At one point though, I saw the words “come on” scroll across my display and I thought she had arrived, but it was a false alarm. Yeah, that app, while very helpful, is far from perfect. I sure was curious what it could have heard though, in our supposedly quiet living room.

Once she did get there though, we were able to have a sort of halting conversation as the car sped toward the audiology clinic. She said it felt odd with the delay, as naturally it took me a few seconds to process what had just been said. And sometimes the transcript just isn’t correct, leading to hilarity. For some reason it often thinks people are saying expletives (the F-word especially). And at this point it doesn’t distinguish among voices, meaning what I’m seeing could be coming from a person or the radio. I hope that can be improved in the future using AI. Still, it was truly a game changer, and perhaps even a lifesaver, in that kind of situation. The last time I lost pretty much all of my hearing due to an ear infection and needed quick treatment, they just had to get started without asking me any questions, which can be dangerous.

In the clinic, the audiologist-in-training was also able to communicate with me, even as she repaired the aids. Usually, I’d just be sitting there waiting for them to plug the left-side aid in so I could hear them. It was great though, knowing what was happening in real time (well, sort of real time).

After that harrowing experience, I was more than happy to take my wife up on breakfast at a local diner called Brigs. There are three locations in the Triangle, and they serve breakfast until early afternoon. The atmosphere was nice, and at the time we got there it wasn’t too loud, so I could actually communicate with the server. They had large and small cups of orange juice, and I wasn’t sure which to get, because how does one define large versus small? I went with the former and was glad I did, as it was only the size of a normal glass. Along with that, I had a delicious sausage and cheese omelet and fruit. She chose to eat blueberry pancakes. It was nice just enjoying a rare meal together in the middle of a workday, and finally, happily, letting all of that stress melt away. Until my next misadventure.

On Braille Access and Live Captioning

If you’ve followed my blog for any length of time, you probably know that I have significant hearing loss. I could likely even be classified as functionally deaf. This has presented challenges not only in the usual places like restaurants, on the bus, or even at work, but also sometimes at home. The issue there is that sometimes I’m trying to hear someone over a television, or the way the room is designed causes an echo that interferes with my hearing aids’ ability to perceive the sound clearly. This is stressful, to say the least.

So color me very excited to discover that in Apple’s latest update to iOS 26, they included a feature called Live Captioning. I believe there is a Live Captioning option that provides visual captions usable by those who are deaf but have sight, but the version I’m going to talk about is built specifically for those of us who are blind and also have hearing issues. That in itself is a revolution, as even most accessibility regulations rarely consider persons with multiple disabilities. They have rules for blind folks, rules for deaf folks, etc., but not often an acknowledgment of the potential crossover, which I dare say is more common than we might think.

Anyhow, if you don’t know how to access Live Captioning and want to, here’s how. First, you have to have a Braille display of some kind; it can have either a Perkins-style keyboard or a QWERTY keyboard, like the APH Mantis I’m using. If using a Perkins-style keyboard, open Braille Access by pressing dots 7 and 8 with the Space bar. To invoke this feature with the Mantis in most cases, press VO (Control+Option, or Control+Windows if you are more Windows inclined) with Y to enable Braille input, then press the letter A and the semicolon as your dots 7 and 8 along with the Space bar. There are other items in the Braille Access menu that appears, which operates separately from your iPhone’s screen, so you could use VoiceOver’s speech to view other content while it is open. It lets you, for example, take notes, view the time, or use Braille to open other apps.

But I’m just pressing the up arrow (Space with dot 1 if using a Perkins-style keyboard) to navigate to Live Captioning and pressing Enter. You also have two options with this feature open, changed easily by pressing dot 7 (or the letter A with a QWERTY keyboard) plus the Space bar: listen to audio or listen to microphone. Audio refers to the audio generated by the iPhone itself, and with this option it can pick up spoken words even without the phone’s volume turned up. I used this to, for instance, follow play-by-play of a Carolina Panthers game via the display while in a car with others (they beat the Dallas Cowboys on the Sunday prior to this entry’s writing, so I was thrilled, as they’ve done virtually nothing over the last five years, but I digress). I’ve also used it to make sure I can hear narrators in an audiobook, as I find that after listening to a voice for a while as I also view the transcript, I can actually understand it better.

If set to listen through the microphone, it’s as if the iPhone is recording. This means that you do not really have use of any other audio on the phone while in that mode. But the beauty of this is that it will pick up conversations and other things happening around me, such as the TV. I used this to speak to a coworker whom I’ve usually had a hard time hearing, just because of how things sound in our office and the fact that we’re positioned relatively far apart. I’ve also used it to chat with my father-in-law about work, and that went smoothly.

There is, naturally, a slight delay between the words being spoken and their appearing on the display, and if you fall behind in reading the transcript (indicated by a braille box on either end of the display), you can just press a cursor routing button to jump to the end. And of course, the transcription is not always totally accurate, though it usually makes enough sense for me to get the gist of what was said. It can also be more challenging to follow if there is a lot of crosstalk going on, as it doesn’t indicate when a different person is speaking. But it was, and is, helping me to pick up at least a degree more than I had before, and every little bit helps. This is especially important as I continue to gain responsibility and standing in my career, not to mention that it’ll likely make family social gatherings not feel as difficult. So I applaud Apple for taking such a step. Let’s hope it’s only the beginning of what can be done to get this tech to truly help us all.