Items tagged with: VoiceOver


Calling All Blind and Low Vision Users!

Exciting news! I’ve reached out to Apple Accessibility, and they’re on board to collaborate with us. They’ve asked me to put together a team so we can work directly with their team and engineers.

Are you passionate about improving accessibility features for blind and low vision users? Do you use VoiceOver, Zoom, or other low vision features on your devices? We need your help!

We’re looking for individuals who are interested in providing feedback and testing new features from our point of view. Your input will be invaluable in making technology more accessible for everyone.

If you’re interested in participating, please fill out the form below:

forms.microsoft.com/r/eRQAsmMb…

Let’s make a difference together!
Feel free to share this post.

Best,
Matthew Whitaker

#Accessibility #BlindUsers #LowVision #VoiceOver #Zoom #AppleAccessibility #TechForAll #Inclusion #AccessibilityMatters #blind #lowvision


Does anyone know if #vmware and #voiceover are still broken in this version?


Does anyone know any books or guides about #screenreaders that talk about how the screen reader turns the computer's information into speech? I also want to know about common issues that can make elements and text unreadable or non-interactable. Looking specifically for #nvda content, but would be interested in other screen readers as well, like #voiceover. #blindness #tech #curious


In light of the iOS 18 improvements to Braille Screen Input, I tried getting back into it on my iPhone 15 Pro that still runs the iOS 17.5.1 release. The problem I have always had with BSI is that my fingers and it somehow never went together well. I normally have no trouble using touch screens, but BSI somehow always eluded me. And it all starts with the fact that the calibration gesture itself seems to fail me more often than it works. And it doesn’t seem to remember the way I last used BSI, judging from the many dots that get misinterpreted by my touches once it gets re-engaged. So, is there something I may be missing? Or a special trick to get the finger calibration to work?

#blind #VoiceOver #iOS #Braille #BSI #BrailleScreenInput


Reflecting on Monday’s #WWDC keynote, I must say that it was packed! Of course, many of the AI features won't be available to me initially, not being in the U.S., but some of the other features really excited me, too. Like iPhone mirroring or some of the writing tool enhancements. Will be curious to see how well these work with #VoiceOver. The new Braille Screen Input features some of you wrote about sound really neat! I have so far resisted the temptation to (or habit of) installing the new beta on my devices, unlike the last 10+ years. I thought about treading a bit more lightly now that I am retired. And especially the first few betas can be very stressful. Let's see if I prevail. ;-)


Anybody else running the #iOS 18 beta with VoiceOver experiencing a crash as soon as you open VoiceOver Settings > Speech? I was thinking it was because I use eSpeak, but uninstalling it didn't seem to fix it. #Apple #VoiceOver #Accessibility


Has anyone else experienced this thing in iOS Safari where #VoiceOver will keep interrupting itself with the word "close?" It will say the first few syllables of something, and just cut itself off. Each time you swipe to the next item, VO will do it again. More and more pages are becoming unusable for me. Is there any workaround for this?


When using @MonaApp with #VoiceOver, is it possible to share an image attached to a post with another app? When I use the "view media" rotor action, and then triple-tap on what VoiceOver claims is the image, I get sharing options related to the alt text, or detected text, or something, but not the image itself. In the end I had to take a screenshot of the image and use that instead.


I've just pushed a bunch of #accessibility changes for screen readers to the main branch of FediThready. (It breaks long texts into post-sized chunks.)

I've run through it with #VoiceOver and it _seems_ ok. HOWEVER it all feels like it's "usable" instead of "good".

If there's an #a11y geek out there who uses screen readers regularly, I would REALLY appreciate some suggestions on how to make this actually feel good for folks who rely on screen readers.

github.com/masukomi/fedithread…


I wonder if this is any more #Accessible with #Voiceover? It didn't used to be usable at all. mastodon.social/@macrumors/112…


Are there any good resources for a #Blind person just buying a #Mac? I'd like to dedicate it solely to music composition, but I don't have any experience with the OS apart from what might carry over from being a regular #iPhone and #Voiceover user. Sources on learning #LogicPro from the ground up would be great too. Thanks!


@nick #Sonos has replaced its app not because they truly think the app is better. But because they can replace specialised Android, iOS, Windows, and macOS teams with one generic team who know how to use cross-platform tools.

It goes beyond that, though. Look at the ideas behind the new home screen, which essentially can be described as: "put what you want on it". Is that primarily a user-facing improvement? No.

Rather, it's a reason to not rely on designers who can carefully think through information architecture, viewport sizes, user flows, and the best ways to present information. Make it the user's problem so that they can fire the people whose responsibility it used to be, or move them to another team where they won't be able to do their best work and will eventually quit and not be replaced.

This update goes way beyond #accessibility. It's a fundamental shift in how they do business, and it will be shit for everyone. That, more than the lack of #VoiceOver support, is what will probably cause me to move away from their ecosystem.

@x0 @simon @talon


If you were wondering whether the new #Sonos app is as bad with #VoiceOver as people said, I can confirm that it is.

The first element that receives focus has no #accessible role or name, i.e. VoiceOver doesn't announce anything for it. The screen is split up into sections, like "Recently Played", "Your Services", and "Sonos Favourites", but none of these have headings. And, as previously noted, explore by touch doesn't work; VO seems to just see that blank element I mentioned as being stretched across the entire screen.

As a result of all this, the "Search" button requires 32 swipes from the top of the screen to reach, at least with my setup. If you have more services and/or more favourites, that number of swipes will be higher. #accessibility


You know, one thing I really do like about Android (this works on Pixel, but IDK about others): when you turn off the stupid, awful, frustrating bullcrap where you have to tell your phone to "stop", shouting over the alarm to be heard... you can then double tap with two fingers, with TalkBack, to immediately stop the alarm. No need to swipe to the stop button and double tap.

Of course, just like a lot of things in Android, the double tap with two fingers just sends the "play/pause" signal, so it's not really a Magic Tap that apps can use to do interesting things, like how in DiceWorld on iOS you can Magic Tap (double tap with two fingers) to roll the dice without needing to find that button each time. Stuff like that, in apps, is really nice.
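For context, iOS exposes Magic Tap to apps through a UIKit override. This is a minimal sketch, assuming a hypothetical view controller and a made-up `rollDice()` method (not DiceWorld's actual code):

```swift
import UIKit

// Hypothetical sketch: a view controller can opt into Magic Tap
// (two-finger double tap with VoiceOver running) via this UIKit hook.
class GameViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        rollDice()   // perform the app's primary action
        return true  // tell VoiceOver the gesture was handled
    }

    func rollDice() {
        // game logic would go here
    }
}
```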

Another issue with Android is the way apps handle speech; they almost always just send output of ongoing things, like live OCR results, to the system TTS engine instead of TalkBack. This is mainly because that's how it's always been done, but now that Braille is an option, I really hope developers start sending announcements directly to TalkBack. On iOS, for example, I can play DiceWorld completely in Braille because it sends all announcements to VoiceOver, not the TTS engine. See, Android has been all about speech at all costs, going back to the days of the Eyes-Free shell, since TalkBack couldn't use the touch screen yet. iOS, I think, has always let apps send content to VoiceOver, so it can read whatever the dev needs it to, and that content thus also shows up in Braille, can easily be interrupted, all that.
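The iOS side of this is a one-liner: instead of speaking through a TTS engine, an app posts an announcement notification that VoiceOver then speaks, brailles, and can interrupt like any of its own output. A minimal sketch, with `announce` as an assumed helper name:

```swift
import UIKit

// Route a string through VoiceOver rather than a separate TTS engine.
// VoiceOver speaks it, shows it on a connected braille display, and the
// user can interrupt it like any other VoiceOver output.
func announce(_ text: String) {
    UIAccessibility.post(notification: .announcement, argument: text)
}
```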

Just some early morning thoughts, don't come at me.

#accessibility #blind #android #iOS #TTS #VoiceOver #TalkBack


I wrote an article about how to make hashtags accessible. Did some #ScreenReader testing with #JAWS, #NVDA, #VoiceOver and #Narrator, which was fun!

Pretty long one though, contains a bunch of tables for comparison. Enjoy the ride!

stevefrenzel.dev/posts/easy-we…

#accessibility #a11y #html #JavaScript #WebDev #frontend


Yesterday I started a new miniseries on my channel, in which I try to make a tutorial about a single topic in under 10 minutes. Starting with:
Logic Pro Bite Size for VI's - Loading Third Party Plugins youtu.be/nIRyG-puBfs
#LogicPro #Blind #VI #VisuallyImpaired #ScreenReader #VoiceOver



If you're a #blind #Mac user, you may have never found a use for that bizarre combination of keys which is VO+Shift+F3. This disables cursor tracking. Let me show you why this may in fact be one of #VoiceOver's best hidden features you didn't know about.
Why Turning Off VoiceOver Cursor Tracking can be Really Useful:


@matt I use the Spitfire Audio downloader (for my sins) on Mac and it's 99.5% accessible with VOCR, and 1% accessible with #VoiceOver. The only reason it gets the 1% is because, just like when you write your name, date and title at the top of the paper and likely get marks for good spelling, we can see the close button, but nothing else. Haha.
In this instance, you'd best know how to use VOCR before you get into their libraries; without it, you're simply out of luck.
Situations like that, sure. But that's different from an email from a supposedly reputable company with important information that absolutely, without question, *needs* to be in a format that is accessible to *all* customers, irrespective of ability or disability.

You don't *need* Spitfire, but if a company has suffered some kind of hack, you *do* need to know about it, you know?


Greetings!

Well, after finding out that #WhaleBird
github.com/h3poteto/whalebird-… is a decent enough accessible macOS client for use with #Misskey, you'll see me post on here from time to time now as well! Most of the time I'll still be posting on my Mastodon account, though watch this space! Three thousand characters is more than I would ever need myself, but I'll take it!

For users of #VoiceOver, the 'J' & 'K' keys currently do not speak the post under the cursor, so normal VO commands are necessary for now. Definitely a client for #Blind users to check out though!

NB. As noted on the GitHub page, WhaleBird is also available for #Windows and #Linux, though I'll leave those builds to you guys!


OK, this is great. Just found an #iOS equivalent of #MacWhisper and it’s free. Not sure if it always is, but it certainly is right now.
It’s #Accessible and works well with #VoiceOver.

#Aiko by Sindre Sorhus

apps.apple.com/gb/app/aiko/id1…


Not really. It's #VoiceOver for #iOS, and different synths treat it differently, but it is not a reliable or nice experience. You may have an easy ride, but that doesn't mean everyone else will, and my post proves this. There's simply no need for this kind of frilly behaviour. Standard lettering is not only understandable by a #ScreenReader, but by a non-English speaker too, who may not recognise those letters for their so-called intended purpose.


iOS Shortcuts. Darcy Burnard, from the Maccessibility podcast, did a 10-part series on understanding and using Shortcuts. If you are interested in iOS Shortcuts, I would highly recommend checking out this series! You can find it on the ACB Community podcast. You can either join the podcast through your favorite podcast app, or go to the website link at the end. There are many different things posted to this podcast feed, but just do a search for Understanding Shortcuts, and this will filter them out. The series ran from January 23rd to April 10th. Thanks to Darcy for putting in the time and doing this series! I personally listen to them through my podcast app, but here is the website link if you want it. acb-community.pinecast.co. #iOS #Shortcuts #Tip #UnderstandingShortcuts #Blind #Voiceover @DHSDarcy


Inspired by the creative use of some nifty JAWS scripting and the power of iOS shortcuts as demonstrated by @IllegallyBlind, I have decided to try my hand at creating something similar for NVDA, and I think I've succeeded. Note that I'm fairly new at this and by no means a coder, so this is the simplest of simple; in fact, I'm still quite amazed that it works.
What we need:
1. The NVDA Speech Logger addon available at:
github.com/opensourcesys/speec…
2. The following iOS shortcut:
icloud.com/shortcuts/999808bd1…
How to use:
1. Install both: the addon in your NVDA and the shortcut in your shortcuts respectively.
2. In NVDA's settings head over to the Speech Logger category and set the output path to your Dropbox root (that's what the shortcut assumes you're using, feel free to modify as needed);
3. Start logging the local speech with the assigned gesture (by default NVDA+alt+l);
4. Assuming the shortcut is configured properly (Dropbox authorized and all that jazz), launch it and a viewer will pop up with the fresh version of the log file at the time.
One nuisance I've found with this is that the viewer overlay doesn't get picked up by VO swipe gestures, so you need to focus it first through exploration before you can start reading the log. Also, the gestures for the first and last item on the screen will move you to whatever else is on your screen right now, so you have to explore again to close the viewer. I assume that's a VO bug.
Also bear in mind that, while logging, anything your PC says will ultimately land in a regular text file, and nothing apart from your Dropbox account is protecting it. Use with caution.
Feel free to share feedback.
#Accessibility #Tip #VoiceOver #NVDA #iPhone #iOS #Windows #Blind


Fellow #iOS #swift #swiftui developers. Do you know if the ButtonRole structure actually does anything purposeful? With #VoiceOver, I cannot get any indication whatsoever of the button’s role, while the Apple Developer documentation site, developer.apple.com/documentat…, states that such info should be conveyed. Is there any visual indication of the role? If so, what?
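For reference, this is the usage in question. A minimal sketch with a hypothetical view; whether VoiceOver actually conveys the role is exactly what the post is asking about:

```swift
import SwiftUI

// Marking a button as destructive via ButtonRole. Per the docs this
// should be conveyed to assistive technologies; visually, SwiftUI
// typically tints destructive buttons red in contexts like menus.
struct DeleteRow: View {
    var body: some View {
        Button("Delete", role: .destructive) {
            // deletion logic would go here
        }
    }
}
```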


To all my #blind #Mac #VoiceOver users, I made a demo that I hope some here may find beneficial. Please do share if you feel like it. Why Turning Off VoiceOver Cursor Tracking can be Really Useful:


Good news for the @Mastodon #iOS app's #VoiceOver support. @jedfox has been on a roll these last few days and has added some really awesome #accessibility related pull requests to the public repository. Hope they all get merged soon and new TestFlight builds made available so we can test them out before release. Among the fixes are the compose button, custom actions, better labels in the compose screen, and more.

Hope @Gargron or @zeitschlag can approve and merge them soon. #a11y