"Screen readers do not need to be saved by AI"

TetraLogical's @craigabbott has written a post on his own blog exploring why we shouldn’t expect screen readers to be augmented with AI to fix problems with bad content.

The real problem is producing inaccessible content from the start, such as misused emojis, poor descriptions, or unclear writing.

craigabbott.co.uk/blog/screen-…

#Accessibility #ScreenReaders #InclusiveDesign

in reply to TetraLogical

As a former screen reader developer myself, I mostly agree with this article. But, for the specific example of the clapping hands emoji, I think it would be easy enough to add a special case to the screen reader -- not "AI", but just a good old hand-coded heuristic -- to filter the text and then play a clapping-hands sound effect synchronized with each of the words. I just wonder if there's broad consistency on whether the emoji comes before or after the word.
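Something like this minimal sketch is what I have in mind -- plain Python, not code from any actual screen reader, and the function and sound names are made up for illustration:

```python
# Hand-coded heuristic: detect "clap-speak" ("this 👏 is 👏 important"),
# strip the emoji, and hand back the bare words so the speech layer can
# play a clap sample alongside each one. Whether the emoji sits before
# or after each word doesn't matter to this approach.

CLAP = "\N{CLAPPING HANDS SIGN}"  # 👏 (skin-tone variants omitted for brevity)

def clap_pattern_words(text: str):
    """Return the bare words if the text looks like clap-speak,
    otherwise None so normal emoji handling applies."""
    clap_count = text.count(CLAP)
    words = text.replace(CLAP, " ").split()
    # Roughly one clap per word boundary, and at least two of them.
    if clap_count >= 2 and clap_count >= len(words) - 1:
        return words
    return None

# The speech layer would then do something like:
#   for word in clap_pattern_words(post) or []:
#       speak(word)
#       play_sound("clap.wav")
```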


in reply to Matt Campbell

@matt I really feel like this insistence that people not add "fun" expressions to their language is never, ever going to be a viable strategy for addressing accessibility more broadly. Gen alpha slackers attempting to use indecipherable emoji algospeak to impress their friends or evade algorithmic filters in their Instagram posts are not going to be persuaded to read a 216-page $30 book before they write anything.
in reply to Glyph

@glyph @matt you’re right, people aren’t going to randomly just pick up a book and read about it. But, I think education is a key part to all of this. I just think it needs to be taught earlier, in things like key skills, communications, and computer science curriculums. Accessibility doesn’t get taught during those formative years, when it would likely have the most impact. We’re trying to re-teach every new generation retrospectively.
in reply to Glyph

@matt Like, Know Your Meme added a "clap emoji" entry 9 years ago. Why are we carrying water for NV Access for failing to implement anything to detect that pattern for nearly a decade? Does *Apple* not have the resources to code the ability for VoiceOver to interpret the 10 or so most popular unicode & emoji idioms at the rate of one per year? Saying we need to train kids to avoid fun rather than demand more from these companies seems backwards.
in reply to Glyph

@glyph You're definitely right about Apple. I'm much more sympathetic toward NV Access, because they're a tiny non-profit working on an open-source project. And if I remember correctly, NVDA only gained an internal TTS API rich enough to support sound effects synchronized to individual words in like 2019 or 2020. (I had written that feature in 2003, for the purpose of indicating links with a tone, but I can understand having different priorities.)
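For anyone curious what "rich enough" means here, this is the rough shape of it -- hypothetical types, not NVDA's actual speech commands API: a speech sequence that interleaves text with sound-effect items, which the synthesizer driver resolves against the engine's word/index callbacks.

```python
from dataclasses import dataclass

@dataclass
class Text:
    value: str

@dataclass
class PlaySound:
    path: str

def clap_speech_sequence(words):
    """Build [Text("this"), PlaySound("clap.wav"), Text("is"), ...]."""
    sequence = []
    for word in words:
        sequence.append(Text(word))
        sequence.append(PlaySound("clap.wav"))
    return sequence

# A driver walks the sequence, sending Text items to the TTS engine and
# firing each PlaySound when the engine reports it has reached that point
# in the utterance. The same mechanism covers the 2003 use case of playing
# a tone to indicate links.
```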

@craigabbott @TetraLogical

in reply to Matt Campbell

@glyph The company that I _really_ think we shouldn't carry any water for is Vispero, the current owner of JAWS. Clearly at this point the company is mainly trying to squeeze more money out of what IMO can justifiably be described as the WinZip of screen readers. I make that comparison deliberately; WinZip was at one point owned by the same private equity group that invested in Vispero. And both products now have high-quality free counterparts.

@craigabbott @TetraLogical

in reply to Matt Campbell

@glyph To clarify, when I said "I wrote that feature in 2003", I meant I implemented sound effects synchronized with TTS _in my company's own product_. And I was still entirely new to developing a screen reader, or more precisely, a talking web browser, back then. I was also quite new to _using_ a GUI screen reader, so I brought my own ideas about how they should work. Maybe the current screen readers are getting long in the tooth and it's time for a new generation.
in reply to Glyph

@matt I am just frustrated that a lot of the way accessibility breaks through as a topic into broader conversations is "look at this common mode of communication. look at this hilarious failure mode that a screen reader has when it's used. now that you know that, everyone stop doing it". It's practically a meme template at this point. I'm sure that you've done tons of other great stuff on the subject that I've never seen because it didn't break through.
in reply to Glyph

@matt To keep things on the topic of your actual words, I do disagree with this sentence from your article: "The responsibility for accessibility lies with the person creating the content". This is individualizing a systemic problem. It's applying scaling leverage at the most inefficient point. Fix ~100 screen readers or train a billion or so sighted people? One of those numbers is much smaller than the other.