Items tagged with: VoiceOver
Calling All Blind and Low Vision Users!
Exciting news! I’ve reached out to Apple Accessibility, and they’re on board to collaborate with us. They’ve asked me to put together a team so we can work directly with their team and engineers.
Are you passionate about improving accessibility features for blind and low vision users? Do you use VoiceOver, Zoom, or other low vision features on your devices? We need your help!
We’re looking for individuals who are interested in providing feedback and testing new features from our point of view. Your input will be invaluable in making technology more accessible for everyone.
If you’re interested in participating, please fill out the form below:
forms.microsoft.com/r/eRQAsmMb…
Let’s make a difference together!
Feel free to share this post.
Best,
Matthew Whitaker
#Accessibility #BlindUsers #LowVision #VoiceOver #Zoom #AppleAccessibility #TechForAll #Inclusion #AccessibilityMatters #blind #lowvision
In light of the iOS 18 improvements to Braille Screen Input, I tried getting back into it on my iPhone 15 Pro, which still runs the iOS 17.5.1 release. The problem I have always had with BSI is that my fingers and it never quite got along. I normally have no trouble using touch screens, but BSI has somehow always eluded me. It all starts with the fact that the calibration gesture itself seems to fail me more often than it works. And it doesn't seem to remember the last way I used BSI very well, judging from the many dots that get misinterpreted by my touches once it gets re-engaged. So, is there something I may be missing? Or a special trick to get the finger calibration to work?
I've just pushed a bunch of #accessibility changes for screen readers to the main branch of FediThready. (It splits long texts into post-sized chunks.)
I've run through it with #VoiceOver and it _seems_ ok. HOWEVER it all feels like it's "usable" instead of "good".
If there's a #a11y geek out there who uses screen readers regularly I would REALLY appreciate some suggestions on how to make this actually feel good for folks who rely on screen readers.
github.com/masukomi/fedithread…
GitHub - masukomi/fedithready: A tool for splitting up long posts in to fediverse sized chunks. It's a web page that lives on your computer.
MacRumors.com (@macrumors@mastodon.social)
Little Snitch 6 Released for macOS Sonoma With DNS Encryption, Integrated Blocklists, New Traffic Chart, and More https://www.macrumors.com/2024/05/22/little-snitch-6-released-macos-sonoma/
@nick #Sonos has replaced its app not because they truly think the new app is better, but because they can replace specialised Android, iOS, Windows, and macOS teams with one generic team who know how to use cross-platform tools.
It goes beyond that, though. Look at the ideas behind the new home screen, which essentially can be described as: "put what you want on it". Is that primarily a user-facing improvement? No.
Rather, it's a reason to not rely on designers who can carefully think through information architecture, viewport sizes, user flows, and the best ways to present information. Make it the user's problem so that they can fire the people whose responsibility it used to be, or move them to another team where they won't be able to do their best work and will eventually quit and not be replaced.
This update goes way beyond #accessibility. It's a fundamental shift in how they do business, and it will be shit for everyone. That, more than the lack of #VoiceOver support, is what will probably cause me to move away from their ecosystem.
If you were wondering whether the new #Sonos app is as bad with #VoiceOver as people said, I can confirm that it is.
The first element that receives focus has no #accessible role or name, i.e. VoiceOver doesn't announce anything for it. The screen is split up into sections, like "Recently Played", "Your Services", and "Sonos Favourites", but none of these have headings. And, as previously noted, explore by touch doesn't work; VO seems to just see that blank element I mentioned as being stretched across the entire screen.
As a result of all this, the "Search" button requires 32 swipes from the top of the screen to reach, at least with my setup. If you have more services and/or more favourites, that number of swipes will be higher. #accessibility
You know, there's one thing I really do like about Android; it works on Pixel, but I don't know about other phones. When you turn off the stupid, awful, frustrating bullcrap where you have to tell your phone to "stop", shouting over the alarm to be heard, you can then double tap with two fingers, with TalkBack, to immediately stop the alarm. No need to swipe to the stop button and double tap.
Of course, just like a lot of things in Android, the double tap with two fingers just sends the "play/pause" signal, so it's not really a Magic Tap that apps can use to do interesting things. For example, in DiceWorld on iOS you can Magic Tap (double tap with two fingers) to roll the dice, without needing to find that button each time. Stuff like that, in apps, is really nice.
Another issue with Android is the way apps handle speech: they almost always send output of ongoing things, like live OCR results, to the system TTS engine instead of to TalkBack. This is mainly because that's how it's always been done, but now that Braille is an option, I really hope developers start sending announcements directly to TalkBack. On iOS, for example, I can play DiceWorld completely in Braille because it sends all announcements to VoiceOver, not the TTS engine. See, Android has been all about speech at all costs, going back to the days of the Eyes-Free shell, when TalkBack couldn't use the touch screen yet. iOS, I think, has always let apps send content to VoiceOver, so it can read whatever the dev needs it to; that content then also shows up in Braille, can easily be interrupted, all that.
Just some early morning thoughts, don't come at me.
#accessibility #blind #android #iOS #TTS #VoiceOver #TalkBack
I wrote an article about how to make hashtags accessible. Did some #ScreenReader testing with #JAWS, #NVDA, #VoiceOver, and #Narrator, which was fun!
Pretty long one though, contains a bunch of tables for comparison. Enjoy the ride!
stevefrenzel.dev/posts/easy-we…
#accessibility #a11y #html #JavaScript #WebDev #frontend
Easy web accessibility wins: Hashtags
Is there a way to create hashtags that work for everyone? I did some screen reader testing and was surprised by the outcome!
Steve Frenzel
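A common accessibility win for hashtags (likely among the techniques the article tests) is writing multi-word tags in CamelCase, so screen readers can announce each word separately. A tiny illustrative sketch in Python; the helper name is made up for this example:

```python
def make_hashtag(words):
    """Join words into a CamelCase hashtag so screen readers
    announce each word, e.g. ["low", "vision"] -> "#LowVision"."""
    return "#" + "".join(w.capitalize() for w in words)

print(make_hashtag(["low", "vision"]))            # → #LowVision
print(make_hashtag(["accessibility", "matters"]))  # → #AccessibilityMatters
```

Compare "#lowvision", which many screen readers read as one garbled word, with "#LowVision", which is announced as two.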
Thoughts And Tips After My App Was Nominated For A Golden Apple Award From AppleVis
Thoughts after a surprise nomination.
Me (Chris Wu)
Logic Pro Bite Size for VI's - Loading Third Party Plugins youtu.be/nIRyG-puBfs
#LogicPro #Blind #VI #VisuallyImpaired #ScreenReader #VoiceOver
Logic Pro Bite Size for VI's - Loading Third Party Plugins
In this video I will show you how to load a third party plugin in Logic Pro as a blind user.This new mini-series aims to help a blind/visually impaired Voice...YouTube
Why Turning Off VoiceOver Cursor Tracking Can Be Really Useful:
@matt I use the Spitfire Audio downloader (for my sins) on Mac, and it's 99.5% accessible with VOCR and 1% accessible with #VoiceOver. The only reason it gets the 1% is that we can see the close button, but nothing else; it's a bit like how writing your name, date, and title at the top of the paper will likely get you marked for good spelling. Haha.
In this instance, you'd best know how to use VOCR before you get into their libraries; without it, you're simply out of luck.
Situations like that, sure. But that's different from an email from a supposed reputable company with important information that absolutely, without question, *needs* to be in a format that is accessible to *all* customers, irrespective of ability or disability.
You don't *need* Spitfire, but if a company has suffered some kind of hack, you *do* need to know about it, you know?
Greetings!
Well, after finding out that #WhaleBird
github.com/h3poteto/whalebird-… is a decent enough accessible macOS client for use with #Misskey, you'll see me post on here from time to time now as well! Most of the time I'll still be posting on my Mastodon account, though watch this space! Three thousand characters is more than I would ever need myself, but I'll take it!
For users of #VoiceOver, the 'J' and 'K' keys currently do not speak the post under the cursor, so normal VO commands are necessary for now. Definitely a client for #Blind users to check out, though!
NB: As noted on the GitHub page, WhaleBird is also available for #Windows and #Linux, though I'll leave those builds to you guys!
GitHub - h3poteto/whalebird-desktop: An Electron based Mastodon, Pleroma, and Misskey client for Windows, Mac, and Linux
OK, this is great. Just found an #iOS equivalent of #MacWhisper and it’s free. Not sure if it always is, but it certainly is right now.
It’s #Accessible and works well with #VoiceOver.
#Aiko by Sindre Sorhus
ACB Community
Welcome to the ACB Radio Community Podcast, home to content from ACB sponsored community events. Our community is growing and we want everyone to find their place in it.
What we need:
1. The NVDA Speech Logger addon available at:
github.com/opensourcesys/speec…
2. The following iOS shortcut:
icloud.com/shortcuts/999808bd1…
How to use:
1. Install both: the addon in NVDA and the shortcut in your Shortcuts app, respectively.
2. In NVDA's settings, head over to the Speech Logger category and set the output path to your Dropbox root (that's what the shortcut assumes you're using; feel free to modify as needed).
3. Start logging the local speech with the assigned gesture (by default NVDA+alt+l).
4. Assuming the shortcut is configured properly (Dropbox authorized and all that jazz), launch it, and a viewer will pop up with the fresh version of the log file.
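Since everything goes through Dropbox, the synced log can also be read from any other machine that sees that folder, not just the iOS shortcut. A minimal Python sketch of a desktop-side log tail; the file name and location are assumptions, so match them to your actual Speech Logger output path:

```python
import time
from pathlib import Path

# Hypothetical path: point this at the actual file the Speech Logger
# addon writes inside your Dropbox root.
LOG_FILE = Path.home() / "Dropbox" / "nvda-speech.log"

def read_new_lines(path, offset):
    """Return (new_lines, new_offset) for anything appended since offset."""
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        f.seek(offset)
        lines = f.readlines()
        return lines, f.tell()

def tail_log(path=LOG_FILE, poll_seconds=2.0):
    """Poll the synced log and print new speech lines as they arrive."""
    offset = 0
    while True:
        if Path(path).exists():
            lines, offset = read_new_lines(path, offset)
            for line in lines:
                print(line, end="")
        time.sleep(poll_seconds)
```

Polling is deliberate here: Dropbox replaces the file on sync, so a simple seek-and-read loop is more robust than holding one file handle open.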
One nuisance I've found with this is that the viewer overlay is not included in the VO gesture order, so you need to focus it first through exploration before you can start reading the log. Also, the gestures for the first and last item on the screen will move you to whatever else is on your screen right now, so you have to explore again to close the viewer. I assume that's a VO bug.
Also bear in mind that, while logging, anything your PC says will ultimately land in a regular text file and nothing apart from your Dropbox account is protecting it. Use with caution.
Feel free to share feedback and suggestions.
#Accessibility #Tip #VoiceOver #NVDA #iPhone #iOS #Windows #Blind
GitHub - opensourcesys/speechLogger: An NVDA add-on to log speech to a file. Includes support for logging remote sessions.
Good news for the @Mastodon #iOS app's #VoiceOver support. @jedfox has been on a roll these last few days and has added some really awesome #accessibility related pull requests to the public repository. I hope they all get merged soon and new TestFlight builds made available, so we can test them out before release. Among the fixes are the compose button, custom actions, better labels in the compose screen, and more.
Hope @Gargron or @zeitschlag can approve and merge them soon. #a11y