iOS 26 and VoiceOver/Braille Access/Live Listen had a good run yesterday. I was at a workshop and I was able to use Live Listen on my BI40XB Braille display for hours, solidly reading in Braille what the various speakers were talking about. I could go back to the beginning of the whole workshop, jump anywhere in the text/Braille, and then zip back to the current words that were being spoken/translated. I would say the transcription of voice to text/Braille was about 98% accurate. This will make things a lot more accessible for people who are Deafblind, with no useful vision at all, living, working and playing in a sighted world, without having to have someone there all the time as an interpreter. After all, this is what we all strive for: complete independence and not relying on other people to get things done. I spoke to a friend of mine overseas early this morning; he is completely deaf and completely blind, and he has been using iOS 26 with this function extremely effectively when he goes out in the community and needs to chat to people. Pretty incredible stuff when you think about it. You can also copy and paste the transcript, and get an AI summary of the captured information.