Items tagged with: VoiceOver


Reflecting on Monday’s #WWDC keynote, I must say that it was packed! Of course, many of the AI features won't be available to me initially, not being in the U.S., but some of the other features really excited me too, like iPhone mirroring or some of the writing tool enhancements. I'll be curious to see how well these work with #VoiceOver. The new Braille Screen Input features some of you wrote about sound really neat! So far I have resisted the temptation to (or habit of) installing the new beta on my devices, unlike in the last 10+ years. I thought about treading a bit more lightly now that I am retired, and especially the first few betas can be very stressful. Let's see if I prevail. ;-)


Anybody else running the #iOS 18 beta with VoiceOver experiencing a crash as soon as you open VoiceOver Settings > Speech? I was thinking it was because I use eSpeak, but uninstalling it didn't seem to fix it. #Apple #VoiceOver #Accessibility


Has anyone else experienced this thing in iOS Safari where #VoiceOver will keep interrupting itself with the word "close?" It will say the first few syllables of something, and just cut itself off. Each time you swipe to the next item, VO will do it again. More and more pages are becoming unusable for me. Is there any workaround for this?


When using @MonaApp with #VoiceOver, is it possible to share an image attached to a post with another app? When I use the "view media" rotor action, and then triple-tap on what VoiceOver claims is the image, I get sharing options related to the alt text, or detected text, or something, but not the image itself. In the end I had to take a screenshot of the image and use that instead.


I've just pushed a bunch of #accessibility changes for screen readers to the main branch of FediThready. (It splits long texts into post-sized chunks.)

I've run through it with #VoiceOver and it _seems_ ok. HOWEVER it all feels like it's "usable" instead of "good".

If there's an #a11y geek out there who uses screen readers regularly, I would REALLY appreciate some suggestions on how to make this actually feel good for folks who rely on screen readers.

github.com/masukomi/fedithread…


I wonder if this is any more #Accessible with #Voiceover? It didn't used to be usable at all. mastodon.social/@macrumors/112…


Are there any good resources for a #Blind person just buying a #Mac? I'd like to dedicate it solely to music composition, but I don't have any experience with the OS apart from what might carry over from being a regular #iPhone and #Voiceover user. Sources on learning #LogicPro from the ground up would be great too. Thanks!


@nick #Sonos has replaced its app not because they truly think the app is better. But because they can replace specialised Android, iOS, Windows, and macOS teams with one generic team who know how to use cross-platform tools.

It goes beyond that, though. Look at the ideas behind the new home screen, which essentially can be described as: "put what you want on it". Is that primarily a user-facing improvement? No.

Rather, it's a reason to not rely on designers who can carefully think through information architecture, viewport sizes, user flows, and the best ways to present information. Make it the user's problem so that they can fire the people whose responsibility it used to be, or move them to another team where they won't be able to do their best work and will eventually quit and not be replaced.

This update goes way beyond #accessibility. It's a fundamental shift in how they do business, and it will be shit for everyone. That, more than the lack of #VoiceOver support, is what will probably cause me to move away from their ecosystem.

@x0 @simon @talon


If you were wondering whether the new #Sonos app is as bad with #VoiceOver as people said, I can confirm that it is.

The first element that receives focus has no #accessible role or name, i.e. VoiceOver doesn't announce anything for it. The screen is split up into sections, like "Recently Played", "Your Services", and "Sonos Favourites", but none of these have headings. And, as previously noted, explore by touch doesn't work; VO seems to just see that blank element I mentioned as being stretched across the entire screen.

As a result of all this, the "Search" button requires 32 swipes from the top of the screen to reach, at least with my setup. If you have more services and/or more favourites, that number of swipes will be higher. #accessibility
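That linear cost can be put into a toy formula — purely my illustration, with assumed per-section counts, not measurements of the Sonos app:

```python
def swipes_to_reach_search(recently_played: int, services: int, favourites: int) -> int:
    """Toy model: with no headings to jump between, reaching the Search
    button means swiping past every element before it. The fixed count
    and the per-section sizes are illustrative assumptions only."""
    fixed_chrome = 2  # e.g. the unlabeled blank element plus a screen title
    return fixed_chrome + recently_played + services + favourites

# With ten items in each section, Search ends up 32 swipes away.
print(swipes_to_reach_search(10, 10, 10))
```

With real section headings, VoiceOver's rotor could jump heading to heading instead, cutting the cost to roughly one gesture per section.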


You know, here's one thing I really do like about Android; it works on Pixel, but IDK about others. When you turn off the stupid, awful, frustrating bullcrap where you have to tell your phone to "stop", shouting over the alarm to be heard... you can then double-tap with two fingers, with TalkBack, to immediately stop the alarm. No need to swipe to the stop button and double-tap.

Of course, just like a lot of things in Android, the double-tap with two fingers just sends the "play/pause" signal, so it's not really a Magic Tap that apps can use to do interesting things, like how in DiceWorld on iOS you can Magic Tap (double-tap with two fingers) to roll the dice without needing to find that button each time. Stuff like that, in apps, is really nice.

Another issue with Android is the way apps handle speech; they almost always just send the output of ongoing things, like live OCR results, to the system TTS engine instead of TalkBack. This is mainly because that's how it's always been done, but now that Braille is an option, I really hope developers start sending announcements directly to TalkBack. On iOS, for example, I can play DiceWorld completely in Braille because it sends all announcements to VoiceOver, not the TTS engine. See, Android has been all about speech at all costs, coming from the days of the Eyes-Free shell, since TalkBack couldn't use the touch screen yet. iOS, I think, has always let apps send content to VoiceOver, so it can read whatever the dev needs it to; that content then also shows up in Braille, can easily be interrupted, all that.

Just some early morning thoughts, don't come at me.

#accessibility #blind #android #iOS #TTS #VoiceOver #TalkBack


Here are some initial comments:

Kindle doesn't appear in #VoiceDream as a standard content source. Instead, Amazon's web reader is embedded into the app, and the book text is extracted from the web page to be spoken.

This means that Kindle isn't deeply integrated into the rest of the app, and results in the majority of Voice Dream features being unavailable. You can choose a book and start audio playback, but not access bookmarking, text highlighting, annotations, full-text search, the built-in dictionary, etc. More fundamentally, standard book navigation (e.g. by heading) is not possible either. You can skip by page, and that's all.

I don't know if the features aimed at other audiences, such as finger reading and word highlights, work or not. I would suspect not, given the webview-based architecture, but I haven't been able to verify either way.

Meanwhile:

1. Playing a Kindle book doesn't register it as your "currently reading" item. If you relaunch the app, or close the Kindle viewer, you have to locate the book in Amazon's web interface from scratch.
2. Speaking of Amazon's web interface, selecting a book to read happens entirely within it, bringing along all of the accessibility issues you may expect from Amazon in 2024.
3. While Kindle content is playing, the #VoiceOver magic tap gesture causes whatever non-Kindle document you were reading elsewhere in Voice Dream to resume.
4. You can pause the playback of a Kindle book on AirPods. But when you try to resume, you also trigger the previous non-Kindle document.
5. On the two books I've tried, there are large pauses in the speech stream at frequent intervals, lasting almost a second. These don't seem to line up with page changes, and I'm not sure what causes them. Maybe something related to scrolling.
6. I pressed "next page" four or five times in quick succession, to jump past all of the copyright information in a book. Unfortunately, this caused playback to completely stop working, no matter how many times I toggled it.


Sometimes, you might think that previous #accessibility wisdom has been superseded by new "facts". Maybe someone told you that #screenReaders don't work well with a particular design pattern, but you tested #ScreenReader X and it seemed to work fine. Perhaps you heard that an interactive HTML input doesn't persist with forced colours styling, but you tried a High Contrast mode in Microsoft Edge and it seemed to be there.

There are three considerations usually missing here:

1. How are you defining and evaluating the working state? Do you have a functional, accurate understanding of the #accessTechnology or accessibility feature you are asserting things about?
2. You tested one thing in relation to a statement about multiple things, e.g. a statement is made about screen readers, plural, and you only tested with #VoiceOver (it's always VoiceOver). Beyond posting on the web-a11y Slack, how do you propose testing more broadly, if you plan to at all?
3. Possibly the most critical of all: is this question worth its overheads? If answering it conclusively would require me to test ten screen readers with 45 speech engines, or seven browsers with 52 permutations of CSS properties, maybe following the advice is "cheaper" than determining whether the advice is still completely relevant.
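The overhead in that last point is multiplicative, which is why it balloons so quickly:

```python
# Each independent dimension multiplies the size of the test matrix.
screen_reader_runs = 10 * 45  # screen readers x speech engines
browser_runs = 7 * 52         # browsers x permutations of CSS properties
print(screen_reader_runs, browser_runs)  # 450 364
```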

Important disclaimer: this relates specifically to cases where following the advice would not actively make things worse for users.

TL;DR: when you know doing a thing won't make things bad, doing the thing is usually quicker than evaluating whether not doing the thing is also bad.


Can anyone recommend a good accessible #Journaling app that works on Windows and iOS? #NVDASR #VoiceOver #Accessibility #Screenreader


Okay, is there a clearly defined thing that makes VoiceOver stop saying "clickable" everywhere? This word is sooo annoying; it starts to lose its meaning when you hear it more than, say, twice. I found that even low verbosity settings don't help much. I'll be glad if anyone has suggestions. Thanks. #macOS #VoiceOver #Accessibility #Help #Blind


Hi dear #iOS users! I have a Health widget on my home screen. Is there a way to make it show steps instead of calories, percentage and whatnot, or will it be reason number 7001 why I don't like Apple's policies? And if it's the latter, please suggest a #VoiceOver accessible app for that. Thanks!


I have to explore that use of DT further, as it seems pretty niche and obscure (nobody on the forum has made such a topic before), but I am confident it can be done and that it can help.
4. Paperwork: I have many invoices, serial numbers, contracts, acts, and much other legal stuff. Everyone does, but in my case I am also helping some others with their paperwork. With DT's advanced organization capabilities I can very easily see which paper belongs to whom, whether I have already filled it in, and whether it's a template that I can reuse over and over to ask my dear government for a piece of bread, or whatever else comes to mind. Another awesome thing is the automatic OCR feature: I have set up a simple smart rule which automatically OCRs every non-textual PDF, and since DT interfaces with FineReader's engine the results are very good. I would dare to say it's better than the official FineReader for Mac, but that is subjective, might be inaccurate or accurate only in certain scenarios, beware.
I have just written down four things that came to mind; otherwise this would become the second Bible, but the tool has endless use cases. It's not cheap and it's not for everyone, but if you want to try it, it has a very fair trial. If you have any questions, let me know. Let's learn DEVONthink together!
#blind #accessibility #a11y #VoiceOver #Mac #software (2/2)


Last week a colleague asked me for practical, real-world demonstrations from real screen reader users. I think I have decent ideas but I'd like to hear about anything others have shared and found effective. #accessibility #blind #screen #reader #JAWS #NVDA #VoiceOver


I wrote an article about how to make hashtags accessible. Did some #ScreenReader testing with #JAWS, #NVDA, #VoiceOver and #Narrator, which was fun!

Pretty long one though, contains a bunch of tables for comparison. Enjoy the ride!

stevefrenzel.dev/posts/easy-we…
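One technique in this space — and this is my assumption about the article's content, with a hypothetical helper not taken from it — is writing hashtags in CamelCase so screen readers can announce the individual words. A minimal Python sketch of the splitting logic a client or authoring tool might use:

```python
def speakable_label(tag: str) -> str:
    """Expand a CamelCase hashtag into a speakable label:
    "#ScreenReader" -> "Screen Reader". Hypothetical helper for
    illustration only; not taken from the linked article."""
    body = tag.lstrip("#")
    result = []
    previous = ""
    for ch in body:
        # Insert a space before an uppercase letter that follows a lowercase one.
        if ch.isupper() and previous.islower():
            result.append(" ")
        result.append(ch)
        previous = ch
    return "".join(result)
```

Note the all-caps case: acronym tags like "#NVDA" pass through untouched, since no uppercase letter follows a lowercase one.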

#accessibility #a11y #html #JavaScript #WebDev #frontend


Yesterday I started a new miniseries on my channel, in which I try to make a tutorial about a single topic in under 10 minutes. Starting with:
Logic Pro Bite Size for VI's - Loading Third Party Plugins youtu.be/nIRyG-puBfs
#LogicPro #Blind #VI #VisuallyImpaired #ScreenReader #VoiceOver




If you're a #blind #Mac user, you may have never found a use for that bizarre combination of keys which is VO+Shift+F3. This disables cursor tracking. Let me show you why this may in fact be one of #VoiceOver's best hidden features you didn't know about.
Why Turning Off VoiceOver Cursor Tracking can be Really Useful:


@matt I use the Spitfire Audio downloader (for my sins) on Mac, and it's 99.5% accessible with VOCR and 1% accessible with #VoiceOver. The only reason it gets the 1% is because, just like when you write your name, date and title at the top of the paper and likely get marked for good spelling, we can see the close button, but nothing else. Haha.
In this instance, you'd best know how to use VOCR before you get into their libraries; without it, you're simply out of luck.
Situations like that, sure. But that's different from an email from a supposedly reputable company with important information that absolutely, without question, *needs* to be in a format that is accessible to *all* customers, irrespective of ability or disability.

You don't *need* Spitfire, but if a company has suffered some kind of hack, you *do* need to know about it, you know?


Greetings!

Well, after finding out that #WhaleBird
github.com/h3poteto/whalebird-… is a decent enough accessible macOS client for use with #Misskey, you'll see me post on here from time to time now as well! Most of the time I'll still be posting on my Mastodon account, though watch this space! Three thousand characters is more than I would ever need myself, but I'll take it!

For users of #VoiceOver, the 'J' & 'K' keys currently do not speak the post under the cursor, thus normal VO commands are necessary for now. Definitely a client for #Blind users to check out, though!

NB. As noted on the GitHub page, WhaleBird is also available for #Windows and #Linux, though I'll leave those builds to you guys!


OK, this is great. Just found an #iOS equivalent of #MacWhisper and it’s free. Not sure if it always is, but it certainly is right now.
It’s #Accessible and works well with #VoiceOver.

#Aiko by Sindre Sorhus

apps.apple.com/gb/app/aiko/id1…


Not really. It's #VoiceOver for #iOS, and different synths treat it differently, but it is not a reliable or nice experience. You may have an easy ride, but that doesn't mean everyone else will, and my post proves this. There's simply no need for this kind of frilly behaviour. Standard lettering is not only understandable by a #ScreenReader, but by a non-English speaker too, who may not recognise those letters for their so-called intended purpose.


iOS Shortcuts: Darcy Burnard, from the Maccessibility podcast, did a 10-part series on understanding and using Shortcuts. If you are interested in iOS Shortcuts, I would highly recommend checking out this series! You can find it on the ACB Community podcast. You can either follow the podcast through your favorite podcast app, or go to the website link at the end. Now, there are many different things posted to this podcast feed, but just do a search for "Understanding Shortcuts" and this will filter them out. The series ran from January 23rd to April 10th. Thanks to Darcy for putting in the time and doing this series! I personally listen through my podcast app, but here is the website link if you want it: acb-community.pinecast.co. #iOS #Shortcuts #Tip #UnderstandingShortcuts #Blind #Voiceover @DHSDarcy


Inspired by the creative use of some nifty JAWS scripting and the power of iOS Shortcuts as demonstrated by @IllegallyBlind, I have decided to try my hand at creating something similar for NVDA, and I think I've succeeded. Note that I'm fairly new at this and by no means a coder, so this is the simplest of simple; in fact, I'm still quite amazed that it works.
What we need:
1. The NVDA Speech Logger addon available at:
github.com/opensourcesys/speec…
2. The following iOS shortcut:
icloud.com/shortcuts/999808bd1…
How to use:
1. Install both: the addon in your NVDA and the shortcut in your shortcuts respectively.
2. In NVDA's settings head over to the Speech Logger category and set the output path to your Dropbox root (that's what the shortcut assumes you're using, feel free to modify as needed);
3. Start logging the local speech with the assigned gesture (by default NVDA+alt+l);
4. Assuming the shortcut is configured properly (Dropbox authorized and all that jazz), launch it and a viewer will pop up with the fresh version of the log file at the time.
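Conceptually, the logging side of this pipeline is tiny. Here's a hedged Python sketch of the idea — not the Speech Logger add-on's actual code; the function name and log format are mine:

```python
from datetime import datetime
from pathlib import Path

def log_utterance(text: str, log_path: Path) -> None:
    """Append one spoken utterance, timestamped, to a plain text file.

    When log_path lives in a synced folder (e.g. the Dropbox root the
    shortcut expects), the iOS side can simply fetch and display it.
    """
    stamp = datetime.now().strftime("%H:%M:%S")
    with log_path.open("a", encoding="utf-8") as f:
        f.write(f"[{stamp}] {text}\n")
```

Since every utterance lands in a plain text file in a synced folder, the privacy caveat below applies in full.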
One nuisance I've found with this is that the VO gestures will not land on the viewer overlay automatically, so you need to focus it first through exploration before you can start reading the log. Also, the gestures for the first and last item on the screen will move you to whatever else is on your screen right now, so you have to explore again to close the viewer. I assume that's a VO bug.
Also bear in mind that, while logging, anything your PC says will ultimately land in a regular text file, and nothing apart from your Dropbox account is protecting it. Use with caution.
Feedback welcome.
#Accessibility #Tip #VoiceOver #NVDA #iPhone #iOS #Windows #Blind


Fellow #iOS #swift #swiftui developers: do you know if the ButtonRole structure actually does anything purposeful? With #VoiceOver, I cannot get any indication whatsoever of the button's role, while the Apple Developer documentation site, developer.apple.com/documentat…, states that such info should be conveyed. Is there any visual indication of the role? If so, what?


To all my #blind #Mac #VoiceOver users, I made a demo that I hope some here may find beneficial. Please do share if you feel like it. Why Turning Off VoiceOver Cursor Tracking can be Really Useful:


Good news for the @Mastodon #iOS app's #VoiceOver support. @jedfox has been on a roll these last few days and has added some really awesome #accessibility related pull requests to the public repository. I hope they all get merged soon and new TestFlight builds made available, so we can test them out before release. Among the fixes are the compose button, custom actions, better labels in the compose screen, and more.

Hope @Gargron or @zeitschlag can approve and merge them soon. #a11y