Items tagged with: Accessibility


A great example of why #aria labels may not always be what you want. I see so many articles/pages/what have you on the web right now that still say "Share on Twitter". Visually it's just the X logo now, but the aria label never got updated. You should probably not be sharing anything into that cesspool anymore at this point, but I'll take the easy example of why invisible labeling = not great at times :) #accessibility
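A hedged illustration of that mismatch (hypothetical markup, not taken from any specific site): the visible content is the X logo, but the invisible label still announces the old branding to screen reader users.

```html
<!-- Stale: sighted users see the X logo, screen readers hear "Share on Twitter" -->
<a href="#" aria-label="Share on Twitter">
  <svg aria-hidden="true"><!-- X logo paths --></svg>
</a>

<!-- Updated: the invisible label now matches the visible branding -->
<a href="#" aria-label="Share on X">
  <svg aria-hidden="true"><!-- X logo paths --></svg>
</a>
```

Because `aria-label` overrides any text content as the accessible name, nothing visible flags the staleness; it only shows up for assistive technology users.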


Is this correct? I always thought Access technology was a UK thing. Later I was told Access Tech was the old-fashioned term…

“Access technology and assistive technology are closely related, yet they serve slightly different purposes and audiences. Access Technology is a subset of tools and devices specifically aimed at making mainstream technology usable by people with disabilities. For example, screen readers, like NVDA or JAWS, open up digital content to blind and visually impaired users by converting text to speech or braille. Similarly, captioning on videos makes audio content accessible to people who are deaf or hard of hearing.
Access technology is about creating a pathway to existing environments—digital or physical—so people with disabilities can engage equally with content and experiences designed for a general audience.
Assistive Technology on the other hand, encompasses a broader range of devices and software designed to assist individuals with disabilities in performing functions that might otherwise be difficult or impossible. This can include mobility aids, like wheelchairs or walkers, augmentative communication devices, or adaptive tools for daily living. For example, a braille note-taker supports daily productivity and communication for someone who is blind, while a prosthetic limb aids physical mobility for someone with an amputation. Assistive technology covers tools that provide support specific to the user’s needs, beyond accessibility in mainstream contexts.

Imagine a visually impaired student navigating an online course. Access technology, like a screen reader, bridges the gap by converting on-screen text to speech, allowing her to participate in the course as fully as anyone else. But to take notes or organize her studies, she may rely on a braille note-taker—an assistive technology uniquely tailored to her needs beyond just accessing information. For many, these tools are more than technology; they’re lifelines to independence, connection, and self-expression—bridging gaps and lifting limitations in ways that honour each person’s unique journey.

In simple terms:

• Access Technology helps people with disabilities use mainstream technology.
• Assistive Technology provides personalized support and tools for individuals to live more independently, across both general and unique contexts.

Their paths intersect because many access technologies—like screen readers, magnification tools, or voice recognition—are also considered assistive technology due to their role in supporting independence and empowerment. However, assistive technology is a broader term, often including unique tools tailored for very specific personal needs beyond mainstream accessibility.

As we look to the future, the lines between these two types of technology may continue to blur, with innovations that address complex needs across all contexts. Imagine AI-powered apps that identify objects, read text, or recognize faces in real-time—tools that simultaneously provide access and enhance day-to-day living.

While access technology aims to bridge the mainstream gap, assistive technology offers tailored support that supports each individual's independence.”

#Accessibility #AccessTechnology #AssistiveTechnology #Blind #Disability


If you’re #blind, #signal does nothing to protect your #privacy. They use #hcaptcha, so if you can’t see the images, you have to enable cross-site cookies and give them a valid email. And if you want to contact Signal about it, sending your device info via unencrypted email is the only way. This is not how privacy or #security works. You’re better off with iMessage. Or even WhatsApp! At least they don’t hand your data over to a third-party captcha provider. Or send your device info via unencrypted email. #infosec #a11y #accessibility


Don’t expect Chrome to support keyboard-operable scrolling regions until (maybe) version 132 (anchor link):
adrianroselli.com/2022/06/keyb…

Only 14 versions past its target release!

#accessibility #a11y


Considering getting either an #xbox or #ps5 for the family as a #gaming platform. Any opinions as to which console has the best #accessibility setup for #blind game players?


I got so fed up of waiting for Meta to launch their AI for the Meta Raybans that I took the plunge and set up a VPN. Lo and behold, my glasses now have the AI feature. I’m really enjoying the ability to ask questions about my surroundings & get quick responses, but some of the answers have confirmed that AI is far from infallible. Today, I found a jar in the cupboard which I was pretty sure contained honey. I asked Meta and it said it was a jar of peanut butter. Wrong; it was honey, as confirmed by Be My AI. I suppose the contents are a similar colour, but I thought Meta might read the label.
But if it gets this sort of thing wrong, I’m not convinced it is going to take over the world. If it does, we’re in trouble.
#Accessibility #AI #Tech #Meta


Are you interested in how to use Apple Notes as an #accessible graphing calculator? Check out this post:

nunonuno.micro.blog/2024/11/04…
#accessibility #apple #blind #ios #math #education #tech


🤐disabled and obscured

"My personal take on this is it sucks, as while the control is disabled for all, only for a subset of users (low vision) the text label for the disabled control is illegible."

#a11y #HTML #WCAG #accessibility

html5accessibility.com/stuff/2…


A Day with JAWS 2035: When Your Screen Reader Scripts Itself

The morning light filters through your smart windows, casting a warm glow across the room. Your ambient AI assistant hums gently, “Good morning, Lottie. Would you like to prepare your workspace for the day?”

“Yes, please,” you say, stretching as the AI readies your home office. The blinds adjust automatically, leaving just enough sunlight to boost your energy without causing glare on your neuro-linked glasses. You smile, reflecting on the advances in technology since the days of fiddling with manual screen reader settings and customized scripts. Those days feel like a distant memory, thanks to JAWS’ AI-powered self-scripting feature—your personal assistant that knows exactly how to handle your work routine.

“Let’s get started,” you say, and JAWS springs to life, adjusting the audio tone to your preferred voice—smooth, confident, efficient. As your desktop computer powers on, JAWS begins analysing the applications you’ve opened, sensing your usual email, project management software, and a new program you’ve recently started exploring.

JAWS’ Real-Time Autonomous Scripting: A Custom Fit

“Good morning, Lottie. I’ve detected a new application in use: ResearchHub. Would you like me to generate an initial script for it?” JAWS asks in a gentle tone, its voice coming through the bone conduction implant in your ear.

You nod. “Yes, go ahead and script it.” This isn’t just any regular software; ResearchHub is dense, designed for researchers and developers with an intricate layout. In the past, navigating such software would have required hours of manually creating scripts or waiting for accessibility support. But today, JAWS’ AI-driven self-scripting feature allows it to analyse this program’s unique design and build custom commands as you go.

“Noted. I’ll adapt based on your usage patterns,” JAWS replies, instantly highlighting an unlabelled menu item. “I’ve labelled this as ‘Data Analysis.’ Would you like a shortcut assigned for quick access?”

“Absolutely,” you reply. Moments later, JAWS has created a keystroke, Control-Shift-D, which will take you directly to the Data Analysis section.

As you dive into your tasks, JAWS continues observing your interactions, quietly scripting shortcuts and macros that save you time with each click. You switch over to an email thread about your latest project, and JAWS dynamically adjusts, making sure to read each new message aloud with just the right level of detail. It’s responsive, intuitive, and seems to understand the flow of your work better than ever.

Adaptive Behaviour Learning: Anticipating Your Needs

JAWS has learned over time what works best for you—like knowing when you prefer concise summaries over detailed descriptions or when to read full email threads aloud. Today, though, as you work through complex calculations in ResearchHub, JAWS picks up on repeated actions, noting your frequent need to access specific data fields.

Without you having to prompt it, JAWS speaks up, “Lottie, I’ve noticed you’re navigating back and forth to the Analysis Settings panel. Would you like me to create a macro for this?”

“Yes, that’d be great,” you reply, surprised at how quickly JAWS anticipates these needs. It assigns a simple command, Control-Alt-S, making it even easier for you to access the settings. With each task, JAWS quietly observes, creating personalized shortcuts and learning how to refine your workflow without interrupting your focus.

Your screen reader feels less like a tool and more like an assistant that adapts to your habits, reducing unnecessary actions and helping you move seamlessly between applications. You take a moment to appreciate the leap from manually scripting these shortcuts to having them generated in real-time, tailored perfectly to your unique style.

Dynamic Accessibility Adjustment: Visual Recognition on the Fly

Halfway through the day, you open a report in a new format. The document is packed with complex graphics, diagrams, and untagged elements—historically a nightmare for accessibility. But JAWS, equipped with advanced AI-powered visual recognition capabilities, is ready.

“Diagram detected: This appears to be a bar graph comparing quarterly performance,” JAWS announces, automatically analysing the content. “Would you like a detailed audio description, or should I just provide the key values?”

“Let’s go with the key values,” you respond, eager to save time. In seconds, JAWS summarizes the data, translating it into accessible content without needing additional third-party support. When you encounter unlabelled buttons in another application, JAWS instantly identifies them and provides real-time labels, adjusting the accessibility on the fly.

The thought crosses your mind how revolutionary this is. You’ve moved past needing someone else to make documents or software accessible for you. Instead, your screen reader adapts and scripts the solution independently, as if it’s actively learning how best to support you.

A Collaborative Community of Scripts

As the day wraps up, JAWS asks, “Lottie, would you like to share the custom scripts I created for ResearchHub with the community repository? Other users might find them useful.”

“Yes, please,” you reply. Knowing that the scripts you and JAWS have tailored today could now benefit others brings a sense of community to your day. In the past, each user’s customization stayed personal, but today, JAWS’ community sharing feature allows anonymized scripts to be uploaded to a shared repository, where other users can download them for similar applications. This feature isn’t just a convenience—it’s a small way to contribute to something larger than yourself.

You smile, thinking about the ripple effect of this community effort. As JAWS users across industries contribute their self-generated scripts, the database grows, improving access for everyone.

Reflecting on Progress: A New Kind of Independence

As you finish your work, JAWS reads aloud your notifications, wrapping up your day with a recap. You reflect on how far technology has come since those early days of assistive devices. Back then, using a screen reader required you to work around its limitations, painstakingly scripting or finding ways to access inaccessible software. Today, your screen reader does the heavy lifting, allowing you to focus on your work without the constant barrier of inaccessible content.

Looking back, you remember those initial frustrations, the hours spent tinkering with manual scripts, and the reliance on tech support for inaccessible programs. Now, JAWS’ AI-powered self-scripting has not only given you more control but also reinforced your independence. It’s not just a tool—it’s a partner in productivity.

As you power down, you realize that technology has not replaced your determination; it has amplified it. JAWS has become a proactive assistant, predicting your needs, adjusting to your habits, and making the inaccessible accessible. With the day’s tasks complete, you feel a renewed sense of autonomy—knowing that the tools at your fingertips truly work for you, enhancing not just your productivity but your entire work experience.

The screen fades to black, and the AI’s voice recedes, leaving you with a quiet appreciation for a world where technology supports your strengths, not your limitations.

#Accessibility #AccessAssistive #AI #AssistiveTechnology #Blind #Disability #JAWS #ScreenReader


Um, so I'll need a sighted person for this, but could someone look at my blog's home page & let me know how it looks, please? I was trying for a certain background/color scheme vibe & need to know if things look screwy. blindsquatch.me #WebDesign #Background #ColorScheme #A11Y #Accessibility #Blind #AskFedi


This article explores the barriers blind students encounter when accessing learning content online. Some of their findings include images lacking alternative text, math content presented as images, inaccessible tables, and buttons without labels. The article also highlights accessibility barriers in video games. Please check out the full article for more information. #accessibility proquest.com/openview/793393e2…


*sigh* submitted an issue asking for some accessibility work to be done to a software package and the response was "there's no audience for this, someone who actually uses a screen reader should submit this request". I am definitely not the person to reply to this and have no idea how to do so in a reasonable manner.

#accessibility


Blind Fedi, do you know of any accessible platform for hosting realtime trivia/quizzes? I was asked by someone who wants to organize trivia nights mainly for blind people over Zoom and have no experience with such services. Accessibility matters most for the participation side, since the organizer will be sighted. Boosts appreciated. #a11y #accessibility


In my 20 years of podcasting, I’ve tried many services for recording podcast interviews. I’ve found nothing better than Cleanfeed, because it’s reliable and flexible. While there are services that record locally and then upload, all of them glitch out at one time or another, because there is a lot of complexity in getting that right. It’s heartbreaking to lose a good interview because your double-ender service had a bad hair day. It happens far too much.
Cleanfeed is also good for broadcasting and other live events. During the lockdowns, many broadcasters were using Cleanfeed to send entire shows, including high quality stereo music streams, back to radio stations.
I did an explanation and demonstration of Cleanfeed on The Blind Podmaker podcast, which is still relatively current.
I say all this because Cleanfeed has just come out with a very nice update for iPhones using USB-C. You can now use a USB-C audio device, including audio interfaces, come into Cleanfeed through Safari for iOS, and Cleanfeed will recognise the audio device. This is going to be a big deal for people who want to participate in high-quality podcasts or live broadcasts while on the go. If a radio station has a broadcaster in the field for some reason, coming in through Cleanfeed and a USB-C audio device is going to sound way better than a FaceTime call.
Best of all, Cleanfeed’s developers care deeply about #accessibility. They’ve responded positively to many of my suggestions and those of others in the blind podcasting community.
You can find Cleanfeed at cleanfeed.net.


💟 1 year ago today I joined TetraLogical, the best group of #accessibility practitioners I could ever wish to work with.

If you need practical, expert advice, you know where to get it 🖖🏽

tetralogical.com/services/


Gentle, biannual reminder that RSS is great and can be another tool in your toolbox to curate your online experience, improve mental health, and stay informed. It's also a much more accessible way to consume news if you have a client that handles full-text extraction. I've just updated my public list of RSS feeds after going nearly a whole year without doing so: tristanb.me/feeds/. This is a comprehensive list of most of the feeds that I follow, split up by category. #RSS #accessibility


It's probably not a secret that often apps template their update descriptions so as to save themselves time and translation resources, etc, etc. So then here is a cool update from one of the apps I have: “We are constantly developing the J.O.E.® app. Some of the advantages you currently benefit from: - Bug fixes”. Pretty original, wouldn't you agree? #accessibility #ios


Yesterday, my students presented our work at the ACM ASSETS conference: "'Not Only Annoying but Dangerous': Devising an Ecology of Protections for Photosensitive Social Media Users" dl.acm.org/doi/abs/10.1145/366…

In this study, we investigate prior work, conduct survey inquiries, and use co-design methods to explore how social media design choices influence exposure to dangerous flashing content which can trigger seizures, migraines, nausea, and disorientation for photosensitive users.

Through our analysis, we identify the current ecosystem of flashing content on the Internet, and propose a more robust ecology of protections, including on-device graphics filters that directly edit pixel buffers to prevent flashing before it occurs.

First, existing WCAG guidelines against auto-play of media need to be enforced. Second, users should have device-level control over animation that may trigger flashing, and this control should not be able to be reset by platforms that try to enforce autoplay to support their own ad revenue. Third, other users need to be aware of what makes content dangerous, so that they may stop circulating it and causing accidental exposure. Fourth, creators need to know what makes content dangerous, and how they can test for danger, to prevent them from creating dangerous media in the first place. This includes corporate creators, like movie studios, whose ads for action movies have been a recent source of autoplaying strobing content in movie trailers posted via ad platforms. Fifth, platforms, including GIF libraries, but also all social platforms, need to implement reporting mechanisms specifically for flashing content which can remove that content from circulation. Sixth, there should be actual penalties for platforms and creators that do not react to, correct, and remove dangerous content, or who force auto-play on users.

And seventh, device manufacturers and operating system developers need to create on-device filters that eliminate flashing through simple real time post-processing. Machine Learning classification and prediction algorithms ARE NOT NECESSARY. We can do this with simple math. Yes it may sometimes look weird. But also people won't be dying in their sleep.
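As a hedged sketch of the "simple math" idea (my own illustration, not the authors' implementation): one straightforward post-processing approach is to clamp how much each pixel's luminance may change between consecutive frames, which caps how hard the screen can strobe.

```python
# Illustrative sketch: limit per-pixel luminance change between frames
# (0-255 scale) so a full black-to-white flash is softened over many
# frames instead of appearing instantly. Not a production filter.

def clamp_frame(prev, curr, max_delta=40):
    """Return `curr` with each pixel's change from `prev` capped at max_delta."""
    out = []
    for p, c in zip(prev, curr):
        delta = c - p
        if delta > max_delta:
            out.append(p + max_delta)       # rising too fast: cap the increase
        elif delta < -max_delta:
            out.append(p - max_delta)       # falling too fast: cap the decrease
        else:
            out.append(c)                   # change is gentle: pass through
    return out

# A sudden full-screen white flash after a black frame gets damped:
black = [0, 0, 0, 0]
flash = [255, 255, 255, 255]
softened = clamp_frame(black, flash)   # [40, 40, 40, 40]
```

A real implementation would run this per colour channel (or on luma) on the GPU, but the core operation really is just a bounded difference per pixel, no classifier required.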

This work is very important to me, and I've been working on it (on the side) since 2017. I wasn't allowed to pursue it fully as a graduate student. As faculty, I still had to string the project together on wishes. And I'm still looking for a collaborator to work on the implementation, though if industry would just get their shit together and do it themselves, that'd be great.

#ASSETS2024 #accessibility #Epilepsy #migraine #photophobia #photosensitivity #NEISVoid #graphicsProgramming


Ever find it hard to keep track of links, buttons, and form fields on a cluttered webpage? NVDA’s Screen Layout feature helps! When enabled, items like links and buttons stay in their visual flow, appearing on the same line. When disabled, NVDA separates each link and control onto its own line, making navigation smoother for some users. Toggle Screen Layout on and off with NVDA+V to customize your browsing experience! #NVDA #Accessibility #ScreenReader #TechTips


Back in the old days, we used alt text because folks didn't always want to wait for images to download.

I'm sitting at the end of my driveway, ready to hand out candy, connected to the house wifi.

I'm not getting photos 🤣

I am currently very thankful for those who have added alt text to their images and videos!

#Halloween #a11y = #accessibility


Happy birthday to ARPANET, the forerunner of the modern internet! 53 years ago, the first message was sent over this pioneering network, paving the way for a world of interconnection and innovation.

As a tech enthusiast, I am constantly amazed by the ways the internet has transformed our lives, allowing us to communicate, learn and share ideas across boundaries and borders. From social media to e-commerce, from telemedicine to remote work, the internet has become an essential part of our daily routines, enabling us to connect with others and access a wealth of information at our fingertips.

#ARPANET #InternetAnniversary #Transhumanism #TechInnovation
#InternetHistory #Networking #DigitalRevolution #Innovation #Communication #Cyberculture #OnlineCommunity #TechHerstory #NetNeutrality #OpenSource #Decentralization #DataPrivacy #Cybersecurity #DigitalRights #Accessibility #Inclusion #SocialImpact #DigitalTransformation #FutureTech #TechOptimism #InternetOfThings #AI

🔗 ethw.org/Milestones:Birthplace…


🎶 **Excited to Share My New Arrangement of "Thriller" by Michael Jackson!** 🎶

I had so much fun making this track come to life in my own style and couldn’t have done it without the amazing gear and incredible people who helped along the way!

**Special Thanks To:**
- **Viscount International** 🎹 for that iconic organ sound—big shout-out to Delroy White for taking it up a notch with some custom tweaks.
- **Yamaha Drums** 🥁 bringing the heartbeat of this track.
- **Audix Mics** 🎤 for capturing the drum sound perfectly.
- **Pianoteq** for that smooth Rhodes sound in the background, adding that perfect touch.
- **Nord** 🎹 used the Stage 4 for some awesome Clav sounds.
- **Surge Synthesizer** 🎛️ powered up tons of synth sounds! Surge is fully accessible and open-source, and for all my blind producers, this one is a must.

Shout out to Mike Gardner for making it all sound amazing with the final mix!

Check out these links to all the fantastic companies that helped bring this track together:

🎹 Viscount International - [viscount link]
🥁 Yamaha Drums - [yamaha link]
🎤 Audix Mics - [audix link]
🎹 Pianoteq - [pianoteq link]
🎹 Nord Keyboards - [nord link]
🎛️ Surge Synthesizer - [surge link]

**#Thriller #MusicProduction #ViscountOrgan #YamahaDrums #AudixMics #PianoteqRhodes #NordStage4 #SurgeSynth #BlindMusician #MichaelJackson #MatthewWhitakerMusic #Accessibility #MusicForEveryone**


A friend of mine, also blind and a very advanced user, tried to send me a Zoom meeting invitation with a Mac. Here's what happened:
1. Instead of a meeting at 4 PM, the meeting was set to last from 8 PM the previous day to 4 PM the next day;
2. the appointment location contained only this: "[Name Redacted] is inviting you to a Zoom meeting.";
3. the appointment body was totally empty, so no Zoom link was available.
That's all you need to know about #Accessibility and usability of a Mac with #VoiceOver.
To prevent a holy war among devoted Mac fans: I am not telling you this is completely impossible. I'm just telling you that on Windows it's a breeze for a blind user, and I don't even know how to achieve such messy results there.


Accessibility Courses from Teach Access! teachaccess.org/accessibility-… "These free, self-paced courses for educators provide an introduction to disability and the principles of accessible design across various disciplines" #a11y #webdev #webdevelopment #webdesign #accessibility


Question for #screenreader users: do text emotes like kaomoji generally cause your tools to read out noise or annoying nonsense, or does it just not pronounce it? I am wondering whether it's okay to use them or whether I should go back to good old emoji (that, to my knowledge, get properly read out).

Like this one:
˚‧º·(˚ ˃̣̣̥᷄⌓˂̣̣̥᷅ )‧º·˚

#accessibility #totallyblind


What do people think of this response to ‘generate detailed Alt Text’ by Gemini:

Here is some detailed alt text for the image you provided:
Alt Text:
A large, menacing grey cat, reminiscent of a monster, stands on its hind legs with its front paws raised. It has glowing yellow eyes and a wide, toothy grin. The cat is surrounded by a fiery aura as it emits a powerful beam of energy from its mouth. The beam is directed towards the Empire State Building in New York City, which is engulfed in flames. The city skyline is visible in the background, with tall buildings and smoke rising from the destruction. The overall tone of the image is chaotic and destructive, evoking a sense of fear and awe.
Additional Notes:
• You may want to adjust the description of the cat's appearance based on its specific characteristics.
• If the image is part of a larger context (like a movie poster or book cover), you could add a brief reference to that context.
• The alt text should be concise and informative, providing a clear description of the image without being overly detailed.
Let me know if you have any other questions.

#Accessibility #AI #AIsoftheBlind #AltText #Blind #ComputerVision


I finally set Firefox as my main browser. Last time I used it with NVDA, it kept crashing and crashing and crashing. Let's see what happens this time. All I know is that I cannot use a browser that kneecaps my ability to block ads. The web becomes borderline unusable with ads, especially if you're blind and have to manually trudge through iframes and janky JS coughed up by an ad server one DOM element at a time.
#nvda #blind #ublockorigin #accessibility #firefox


It’s great to be able to finally use the iPhone Mirroring feature in macOS 15.1. It’s kind of magical with VoiceOver until you have to hit the “back” button from inside the app. Still can’t figure out how this is done. Web views don’t seem to work yet either. #accessibility


Hey fellow #BlindMastodon users! I'm looking for recommendations on two types of writing tools for Windows:

1. Accessible text expanders
2. Word prediction software

I use both #JAWS and #NVDA screen readers. I tried Espanso (an open-source text expander), but it didn't work well with NVDA. I also tested Lightkey for word prediction, but it didn't seem accessible.

As an #ActuallyAutistic person, word prediction would really help with my autism-related communication challenges.

Have you found any text expanders or word prediction tools that work well with screen readers? What has your experience been like? I'd love to hear your recommendations and thoughts!

#Accessibility #AssistiveTechnology #TextExpander #WordPrediction #ScreenReader #Windows #AutismAccommodations #autism #blind @mastoblind @main


Have people found that things have improved regarding the Facebook #accessibility bug that meant you couldn’t read back what you’d typed when composing a post?
Any other pain points around Facebook you would like addressed?


After 6 years – we finally got to close our longest-running #accessibility ticket in Wagtail, github.com/wagtail/wagtail/iss…. Thank you @mush42 for reporting this initially. We benefitted immensely from your accessibility expertise and first-hand experience with a screen reader ❤️

This was the ticket I created our "accessibility" ticket triage label for. 38 of those are still open, 165 closed. Kick-ass!


It’s Mid-2022 and Browsers (Mostly Safari) Still Break Accessibility via Display Properties — Adrian Roselli

Great article with tons of updates on this topic. Thanks @aardrian

adrianroselli.com/2022/07/its-… #a11y #accessibility


Remember in August when I gave a talk about HTML tables for the WebAIM online conference?

That video is online!

I have embedded it on my site:
adrianroselli.com/2024/08/talk…

Or you can go to YouTube directly (where you can find the other swell talks from that day).

#accessibility #a11y #HTML


For me, @joplinapp@mastodon.social is the most important #ToDo and #wiki tool for my efficient daily #work and #organization.

With the new version, there are some improvements for #accessibility, such as better support for #screenreaders.

I hope #blind people can enjoy this #app.

github.com/laurent22/joplin/is…
#OpenSource #a11y


Folks, there really is something sliding backwards in #Accessibility. I can't quite pinpoint it, but it shows itself every time I invoice a client: I spend the whole engagement convincing the small business that, well, actually, accessible design is good. Toolbars won't fix accessibility. Your computer can make text bigger. Your monitor can raise and lower its brightness. Your computer has a Magnifier built in.

I'm not joking. Back in 2006, you needed those advanced accessibility talks, but not anymore! Now I get paid to tell people that there are themes in WordPress that make stuff more accessible without toolbars, and yes, you can increase your mouse size.

Seriously, what's going on? I mean, I'll have income for a hell of a long time without really doing anything super taxing these days, but wow! #WebDev


I would like to see us get *well* past “accessibility is good, actually”-level talks about accessibility.

With the title of this talk, I was hoping for a case study on an open source project that put effort into having accessible tools & processes and how that made it possible for disabled devs to participate…because I think often it’s assumed that sure, disabled people can be users, but they wouldn’t be devs.

#AllThingsOpen2024 #accessibility


I do wonder just how serious about #accessibility the team behind the MuseScore software is. At least on macOS, things are broken to the point of making it really quite counter-productive to use. Broken tooltips, strange keyboard navigation, unfinished dialogs, etc, etc. I've been checking for updates forever hoping to see some progress, but progress I see none.


Out of interest, how well does WhatsApp perform on Windows with Jaws and NVDA? On my Mac, VoiceOver focus keeps jumping all over the place, and you can’t read back what you’re typing. #Accessibility


It’s no secret that many of us in the blind community have embraced the rapid advances in Artificial Intelligence over the past two years. We've witnessed firsthand how these technologies can be a powerful force for good, especially within our community. AI-generated image descriptions have revolutionized how we navigate the online world, offering a perspective previously unimaginable. This impact is now undeniable, transforming how we interact with the world.

I’ve declared the kingdom of the blind a republic—perhaps prematurely, but only by a small margin. With AI empowering us to perceive the digital world in new ways, we are no longer ruled by limitations, but actively shaping our future. Anthropic’s recent launch of ‘computer use’ marks the first steps into a new phase of AI evolution—one where AI agents begin to act independently on our behalf, initiating a shift in how we interact with technology.

As AI continues to evolve, so too will the Assistive Technology that many of us depend on. I envision a future where this intelligence becomes a true companion, guiding us seamlessly through both digital landscapes and real-world challenges. We may be just two years away from seeing JAWS, NVDA, or SuperNova transform into true Assistive Intelligence 1.0—or perhaps it will take a little longer. If AI has taught us anything, it’s that progress comes both more slowly than we expect and faster than we can possibly imagine.

What follows is my first attempt at describing how a screen reader of today could take the first steps towards becoming an Assistive Intelligence. If anyone wants to build it, I’d love to help if I can. Whatever you make of it, let me know what you think:

“Proposed AI-Powered Self-Scripting Feature for JAWS Screen Reader

Objective
The suggested feature seeks to integrate advanced AI-driven "computer use" capabilities, like those demonstrated by Anthropic's Claude, into the JAWS screen reader. This functionality would enable JAWS to autonomously create and refine custom scripts in response to real-time user interactions and application environments. The aim is to enhance accessibility and productivity for visually impaired users, especially when navigating non-standard or otherwise inaccessible software interfaces.

Feature Description
The self-scripting capability would empower JAWS to analyse user interactions with applications, identify recurring actions or inaccessible elements, and generate scripts that optimize these processes. By enabling JAWS to perform this autonomously, users gain seamless and personalized access to applications without manual intervention, allowing for an enhanced, efficient experience.

The self-scripting feature will be powered by the following core functions:

1. Real-Time Autonomous Scripting: JAWS would use AI to observe user interactions with applications, especially non-accessible ones, and automatically generate scripts that improve navigation, label untagged elements, and streamline frequent tasks. For example, if a user frequently navigates to a particular form field, JAWS could create a shortcut to this area.

2. Adaptive Behaviour Learning: This feature would allow JAWS to recognize patterns in a user’s interactions, such as repeated actions or commonly accessed elements. JAWS would adapt its behaviour by creating custom macros, enabling faster navigation and interaction with complex workflows.

3. Dynamic Accessibility Adjustment: Leveraging Claude’s approach to visual recognition, JAWS could interpret visual elements (like buttons or icons) and provide instant labelling or feedback. This would be valuable in software with minimal accessibility features, as it enables JAWS to make live adjustments and effectively “teach itself” how to navigate new environments.

4. Community Script Sharing: Self-generated scripts, once verified, could be anonymized and made available to other users via a shared repository. This would foster a collaborative environment, empowering users to contribute to a broader database of accessibility scripts for applications across various industries.
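To make the Adaptive Behaviour Learning idea (point 2) concrete, here is a minimal, purely illustrative sketch of one way the pattern-recognition step could work. It is not JAWS script code, and the function name and log format are invented for the example: a chronological log of UI actions is scanned for short subsequences that recur often enough to be worth offering to the user as a macro.

```python
from collections import Counter

def find_macro_candidates(action_log, seq_len=3, min_count=3):
    """Scan a chronological log of UI actions and return the repeated
    subsequences of length seq_len that occur at least min_count times,
    as candidate macros to suggest to the user."""
    windows = [
        tuple(action_log[i:i + seq_len])
        for i in range(len(action_log) - seq_len + 1)
    ]
    counts = Counter(windows)
    return [seq for seq, n in counts.items() if n >= min_count]

# Hypothetical log: the user repeatedly tabs past two controls
# to reach the same form field before typing into it.
log = (["focus:search", "tab", "tab", "focus:amount", "type"] * 3) + ["save"]
candidates = find_macro_candidates(log, seq_len=3, min_count=3)
```

In a real implementation the hard part is everything around this sliding-window count: deciding which actions are equivalent, when to interrupt the user with a suggestion, and how to bind the resulting macro to a key, but the core "notice what the user keeps doing" step can be this simple.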

Value Proposition
This feature will address key challenges for visually impaired users, including the complexity of navigating inaccessible interfaces and the time-consuming nature of repetitive tasks. The ability for JAWS to generate its own scripts autonomously would mean:
1. Increased Accessibility: Improved interaction with non-accessible software interfaces.
2. Higher Productivity: Reduced need for external support or manual scripting, allowing users to accomplish tasks more independently.
3. Enhanced User Experience: Scripting and macro creation based on personal usage patterns, leading to a more intuitive and personalized experience.

Technical Considerations
1. Performance: Processing real-time visual and user interaction data requires substantial computing power. A cloud-based model may be optimal, offloading some processing requirements and ensuring smooth, responsive performance.
2. Safety: Automated scripting must be closely monitored to prevent unintended interactions or conflicts within applications. Integration of safeguard protocols and user settings to enable/disable autonomous scripting will be essential.
3. Privacy: To ensure user data is protected, anonymization protocols and data privacy standards will be implemented. Data collected from user interactions would be handled in compliance with rigorous privacy standards, safeguarding user preferences and behaviour.
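As a purely illustrative sketch of the anonymization step mentioned above (not part of the proposal itself; the function name and the specific patterns are assumptions), a shared script could be scrubbed of obvious user-identifying fragments, such as home-directory usernames and e-mail addresses, before upload:

```python
import re

def anonymize_script(source):
    """Replace user-identifying fragments in a generated script with
    placeholders before it is shared: home-directory usernames on
    Linux, macOS, and Windows paths, and e-mail addresses."""
    source = re.sub(r"(?:/home/|/Users/|C:\\Users\\)[^/\\\s\"']+",
                    "<HOME>", source)
    source = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "<EMAIL>", source)
    return source
```

Pattern-based scrubbing like this is only a first pass; a real repository would also need human or automated review, since scripts can leak identifying detail in ways no regex anticipates.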

Conclusion
Integrating AI-powered self-scripting capabilities into JAWS would represent a significant leap in screen reader technology. By allowing JAWS to, when requested, autonomously learn, adapt, and script in response to user needs, this feature could provide visually impaired users with unprecedented control and flexibility in navigating digital environments, fostering both independence and productivity. The anticipated benefits underscore the feature’s potential to redefine accessible technology, turning the screen reader into Assistive Intelligence.”

About the Author:

Lottie is a passionate advocate for the transformative potential of AI, especially within the blind and visually impaired community. She blends technical insights with a keen awareness of lived experiences, envisioning a future where AI doesn’t just assist but truly empowers. Her thoughtful reflections explore the shift from a "kingdom of the blind" to a republic, where emerging technologies like AI create new opportunities for autonomy and inclusion.

With a balance of optimism and critical realism, Lottie acknowledges the game-changing impact of AI tools like image descriptions while recognizing that more progress is needed. Her vision extends to the idea of "Assistive Intelligence," where screen readers like JAWS evolve into proactive companions, adapting to users' needs in real-time.

Known for turning complex ideas into actionable blueprints, Lottie is not just an observer of technological trends but a catalyst for innovation. Her proposals reflect a desire to elevate independence and productivity for blind users, pushing the boundaries of what's possible in assistive technology. Her insights continue to inspire conversations and shape the future of accessible tech.

I am the Blind AI, relying on AI every day to enrich my life. While my posts may occasionally benefit from AI assistance, the thoughts, perspectives, and final edits are entirely my own. AI is my tool, much like a calculator or spell-check, refining my expression but never replacing my voice.

#Accessibility #AI #AIsoftheBlind #Blind #ComputerVision #Disability #Innovation #JAWS #NVDA #ScreenReader #SuperNova