Items tagged with: voiceover


Greetings!

Well, after finding out that #WhaleBird (github.com/h3poteto/whalebird-…) is a decent enough accessible macOS client for use with #Misskey, you'll see me post on here from time to time now as well! Most of the time I'll still be posting from my Mastodon account, though watch this space! Three thousand characters is more than I would ever need myself, but I'll take it!

For users of #VoiceOver, the 'J' and 'K' keys currently do not speak the post under the cursor, so normal VO commands are necessary for now. Definitely a client for #Blind users to check out, though!

NB. As noted on the GitHub page, WhaleBird is also available for #Windows and #Linux, though I'll leave those builds to you guys!


OK, this is great. Just found an #iOS equivalent of #MacWhisper, and it’s free. Not sure whether it always will be, but it certainly is right now.
It’s #Accessible and works well with #VoiceOver.

#Aiko by Sindre Sorhus

apps.apple.com/gb/app/aiko/id1…


Not really. It's #VoiceOver for #iOS, and different synths treat it differently, but it is not a reliable or pleasant experience. You may have an easy ride, but that doesn't mean everyone else will, and my post proves this. There's simply no need for this kind of frilly behaviour. Standard lettering is not only understandable by a #ScreenReader, but also by a non-English speaker, who may not recognise those characters for their so-called intended purpose.
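For the curious, the reason those decorative letters trip up screen readers is that they are entirely different Unicode code points from the plain letters they imitate. A quick Python sketch (nothing here is specific to VoiceOver; it just inspects the characters) shows the problem, and the NFKC compatibility normalization that folds most of them back to ordinary text:

```python
import unicodedata

# "Mathematical bold script" letters, often used for decorative posts.
fancy = "𝓥𝓸𝓲𝓬𝓮𝓞𝓿𝓮𝓻"

# Each character is its own code point with a long formal name; a screen
# reader that falls back to spelling may read names like this one aloud.
print(unicodedata.name(fancy[0]))  # MATHEMATICAL BOLD SCRIPT CAPITAL V

# NFKC normalization folds these symbols back to ordinary letters.
plain = unicodedata.normalize("NFKC", fancy)
print(plain)  # VoiceOver
```

This is also why such text often breaks search and copy-paste, not just speech output.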


iOS Shortcuts. Darcy Burnard, from the Maccessibility podcast, did a 10-part series on understanding and using Shortcuts. If you are interested in iOS Shortcuts, I would highly recommend checking out this series! You can find it on the ACB Community podcast. You can either follow the podcast through your favorite podcast app, or go to the website link at the end. There are many different things posted to this podcast feed, but just do a search for Understanding Shortcuts and this will filter them out. The series ran from January 23rd to April 10th. Thanks to Darcy for putting in the time and doing this series! I personally listen through my podcast app, but here is the website link if you want it: acb-community.pinecast.co. #iOS #Shortcuts #Tip #UnderstandingShortcuts #Blind #Voiceover @DHSDarcy


Inspired by the creative use of some nifty JAWS scripting and the power of iOS shortcuts as demonstrated by @IllegallyBlind, I have decided to try my hand at creating something similar for NVDA, and I think I've succeeded. Note that I'm fairly new at this and by no means a coder, so this is the simplest of simple; in fact, I'm still quite amazed that it works.
What we need:
1. The NVDA Speech Logger addon available at:
github.com/opensourcesys/speec…
2. The following iOS shortcut:
icloud.com/shortcuts/999808bd1…
How to use:
1. Install both: the addon into NVDA and the shortcut into your Shortcuts app.
2. In NVDA's settings, head over to the Speech Logger category and set the output path to your Dropbox root (that's what the shortcut assumes you're using; feel free to modify as needed).
3. Start logging the local speech with the assigned gesture (by default NVDA+Alt+L).
4. Assuming the shortcut is configured properly (Dropbox authorized and all that jazz), launch it and a viewer will pop up with the latest version of the log file.
One nuisance I've found with this is that the viewer overlay is not reachable with the usual VO flick gestures, so you need to focus it first through touch exploration before you can start reading the log. Also, the gestures for the first and last item on the screen will move you to whatever else is on your screen right now, so you have to explore again to close the viewer. I assume that's a VO bug.
Also bear in mind that, while logging, anything your PC says will ultimately land in a regular text file and nothing apart from your Dropbox account is protecting it. Use with caution.
Feel free to offer feedback and suggestions.
#Accessibility #Tip #VoiceOver #NVDA #iPhone #iOS #Windows #Blind


Fellow #iOS #swift #swiftui developers: do you know if the ButtonRole structure actually does anything purposeful? With #VoiceOver, I cannot get any indication whatsoever of the button’s role, while the Apple Developer documentation (developer.apple.com/documentat…) states that such information should be conveyed. Is there any visual indication of the role? If so, what?


To all my #blind #Mac #VoiceOver users, I made a demo that I hope some here may find beneficial. Please do share if you feel like it. Why Turning Off VoiceOver Cursor Tracking can be Really Useful:


Good news for the @Mastodon #iOS app‘s #VoiceOver support. @jedfox has been on a roll these last few days and has submitted some really awesome #accessibility related pull requests to the public repository. I hope they all get merged soon and new TestFlight builds made available so we can test them before release. Among the fixes are the compose button, custom actions, better labels in the compose screen, and more.

Hope @Gargron or @zeitschlag can approve and merge them soon. #a11y