Items tagged with: assistivetechnology
I'm curious to hear from fellow blind users about your favorite accessible Windows apps! I just reinstalled Windows and would love some recommendations. What apps do you find essential for daily tasks, productivity, or just for fun? Please share your favorites! #Blind #VisuallyImpaired #Accessibility #WindowsApps #AssistiveTechnology #Windows
Is this correct? I always thought Access technology was a UK thing. Later I was told Access Tech was the old-fashioned term…
“Access technology and assistive technology are closely related, yet they serve slightly different purposes and audiences. Access Technology is a subset of tools and devices specifically aimed at making mainstream technology usable by people with disabilities. For example, screen readers, like NVDA or JAWS, open up digital content to blind and visually impaired users by converting text to speech or braille. Similarly, captioning on videos makes audio content accessible to people who are deaf or hard of hearing.
Access technology is about creating a pathway to existing environments—digital or physical—so people with disabilities can engage equally with content and experiences designed for a general audience.
Assistive Technology, on the other hand, encompasses a broader range of devices and software designed to assist individuals with disabilities in performing functions that might otherwise be difficult or impossible. This can include mobility aids, like wheelchairs or walkers, augmentative communication devices, or adaptive tools for daily living. For example, a braille note-taker supports daily productivity and communication for someone who is blind, while a prosthetic limb aids physical mobility for someone with an amputation. Assistive technology covers tools that provide support specific to the user’s needs, beyond accessibility in mainstream contexts.
Imagine a visually impaired student navigating an online course. Access technology, like a screen reader, bridges the gap by converting on-screen text to speech, allowing her to participate in the course as fully as anyone else. But to take notes or organize her studies, she may rely on a braille note-taker—an assistive technology uniquely tailored to her needs beyond just accessing information. For many, these tools are more than technology; they’re lifelines to independence, connection, and self-expression—bridging gaps and lifting limitations in ways that honour each person’s unique journey.
In simple terms:
• Access Technology helps people with disabilities use mainstream technology.
• Assistive Technology provides personalized support and tools for individuals to live more independently, across both general and unique contexts.
Their paths intersect because many access technologies—like screen readers, magnification tools, or voice recognition—are also considered assistive technology due to their role in supporting independence and empowerment. However, assistive technology is a broader term, often including unique tools tailored for very specific personal needs beyond mainstream accessibility.
As we look to the future, the lines between these two types of technology may continue to blur, with innovations that address complex needs across all contexts. Imagine AI-powered apps that identify objects, read text, or recognize faces in real-time—tools that simultaneously provide access and enhance day-to-day living.
While access technology aims to bridge the mainstream gap, assistive technology offers tailored support for each individual's independence.”
#Accessibility #AccessTechnology #AssistiveTechnology #Blind #Disability
A Day with JAWS 2035: When Your Screen Reader Scripts Itself
The morning light filters through your smart windows, casting a warm glow across the room. Your ambient AI assistant hums gently, “Good morning, Lottie. Would you like to prepare your workspace for the day?”
“Yes, please,” you say, stretching as the AI readies your home office. The blinds adjust automatically, leaving just enough sunlight to boost your energy without causing glare on your neuro-linked glasses. You smile, reflecting on the advances in technology since the days of fiddling with manual screen reader settings and customized scripts. Those days feel like a distant memory, thanks to JAWS’ AI-powered self-scripting feature—your personal assistant that knows exactly how to handle your work routine.
“Let’s get started,” you say, and JAWS springs to life, adjusting the audio tone to your preferred voice—smooth, confident, efficient. As your desktop computer powers on, JAWS begins analysing the applications you’ve opened, sensing your usual email, project management software, and a new program you’ve recently started exploring.
JAWS’ Real-Time Autonomous Scripting: A Custom Fit
“Good morning, Lottie. I’ve detected a new application in use: ResearchHub. Would you like me to generate an initial script for it?” JAWS asks in a gentle tone, its voice coming through the bone conduction implant in your ear.
You nod. “Yes, go ahead and script it.” This isn’t just any regular software; ResearchHub is dense, designed for researchers and developers with an intricate layout. In the past, navigating such software would have required hours of manually creating scripts or waiting for accessibility support. But today, JAWS’ AI-driven self-scripting feature allows it to analyse this program’s unique design and build custom commands as you go.
“Noted. I’ll adapt based on your usage patterns,” JAWS replies, instantly highlighting an unlabelled menu item. “I’ve labelled this as ‘Data Analysis.’ Would you like a shortcut assigned for quick access?”
“Absolutely,” you reply. Moments later, JAWS has assigned a shortcut, Control-Shift-D, which will take you directly to the Data Analysis section.
As you dive into your tasks, JAWS continues observing your interactions, quietly scripting shortcuts and macros that save you time with each click. You switch over to an email thread about your latest project, and JAWS dynamically adjusts, making sure to read each new message aloud with just the right level of detail. It’s responsive, intuitive, and seems to understand the flow of your work better than ever.
Adaptive Behaviour Learning: Anticipating Your Needs
JAWS has learned over time what works best for you—like knowing when you prefer concise summaries over detailed descriptions or when to read full email threads aloud. Today, though, as you work through complex calculations in ResearchHub, JAWS picks up on repeated actions, noting your frequent need to access specific data fields.
Without you having to prompt it, JAWS speaks up, “Lottie, I’ve noticed you’re navigating back and forth to the Analysis Settings panel. Would you like me to create a macro for this?”
“Yes, that’d be great,” you reply, surprised at how quickly JAWS anticipates these needs. It assigns a simple command, Control-Alt-S, making it even easier for you to access the settings. With each task, JAWS quietly observes, creating personalized shortcuts and learning how to refine your workflow without interrupting your focus.
Your screen reader feels less like a tool and more like an assistant that adapts to your habits, reducing unnecessary actions and helping you move seamlessly between applications. You take a moment to appreciate the leap from manually scripting these shortcuts to having them generated in real-time, tailored perfectly to your unique style.
Dynamic Accessibility Adjustment: Visual Recognition on the Fly
Halfway through the day, you open a report in a new format. The document is packed with complex graphics, diagrams, and untagged elements—historically a nightmare for accessibility. But JAWS, equipped with advanced AI-powered visual recognition capabilities, is ready.
“Diagram detected: This appears to be a bar graph comparing quarterly performance,” JAWS announces, automatically analysing the content. “Would you like a detailed audio description, or should I just provide the key values?”
“Let’s go with the key values,” you respond, eager to save time. In seconds, JAWS summarizes the data, translating it into accessible content without needing additional third-party support. When you encounter unlabelled buttons in another application, JAWS instantly identifies them and provides real-time labels, adjusting the accessibility on the fly.
The thought crosses your mind how revolutionary this is. You’ve moved past needing someone else to make documents or software accessible for you. Instead, your screen reader adapts and scripts the solution independently, as if it’s actively learning how best to support you.
A Collaborative Community of Scripts
As the day wraps up, JAWS asks, “Lottie, would you like to share the custom scripts I created for ResearchHub with the community repository? Other users might find them useful.”
“Yes, please,” you reply. Knowing that the scripts you and JAWS have tailored today could now benefit others brings a sense of community to your day. In the past, each user’s customization stayed personal, but today, JAWS’ community sharing feature allows anonymized scripts to be uploaded to a shared repository, where other users can download them for similar applications. This feature isn’t just a convenience—it’s a small way to contribute to something larger than yourself.
You smile, thinking about the ripple effect of this community effort. As JAWS users across industries contribute their self-generated scripts, the database grows, improving access for everyone.
Reflecting on Progress: A New Kind of Independence
As you finish your work, JAWS reads aloud your notifications, wrapping up your day with a recap. You reflect on how far technology has come since those early days of assistive devices. Back then, using a screen reader required you to work around its limitations, painstakingly scripting or finding ways to access inaccessible software. Today, your screen reader does the heavy lifting, allowing you to focus on your work without the constant barrier of inaccessible content.
Looking back, you remember those initial frustrations, the hours spent tinkering with manual scripts, and the reliance on tech support for inaccessible programs. Now, JAWS’ AI-powered self-scripting has not only given you more control but also reinforced your independence. It’s not just a tool—it’s a partner in productivity.
As you power down, you realize that technology has not replaced your determination; it has amplified it. JAWS has become a proactive assistant, predicting your needs, adjusting to your habits, and making the inaccessible accessible. With the day’s tasks complete, you feel a renewed sense of autonomy—knowing that the tools at your fingertips truly work for you, enhancing not just your productivity but your entire work experience.
The screen fades to black, and the AI’s voice recedes, leaving you with a quiet appreciation for a world where technology supports your strengths, not your limitations.
#Accessibility #AccessAssistive #AI #AssistiveTechnology #Blind #Disability #JAWS #ScreenReader
Hey fellow #BlindMastodon users! I'm looking for recommendations on two types of writing tools for Windows:
1. Accessible text expanders
2. Word prediction software
I use both #JAWS and #NVDA screen readers. I tried Espanso (an open-source text expander), but it didn't work well with NVDA. I also tested Lightkey for word prediction, but it didn't seem accessible.
As an #ActuallyAutistic person, word prediction would really help with my autism-related communication challenges.
Have you found any text expanders or word prediction tools that work well with screen readers? What has your experience been like? I'd love to hear your recommendations and thoughts!
#Accessibility #AssistiveTechnology #TextExpander #WordPrediction #ScreenReader #Windows #AutismAccommodations #autism #blind @mastoblind @main
Check out my latest guide, written for @iaccessibility, on Ray-Ban Meta Smart Glasses!
Discover how these smart glasses are redefining accessibility for blind and visually impaired users with AI-powered features.
iaccessibility.net/ray-ban-met…
#Accessibility #RayBanMeta #SmartGlasses #AssistiveTechnology #VisuallyImpaired #Innovation #TechForGood #Inclusion #WearableTech
Not sure if I've talked about this here, but there's, like, tons of stuff here. I'm gonna try that Linux distro through Crostini on my old Chromebook that I don't do anything else with anymore. The VMs over the Internet are really cool too! The AT Museum is cool. And I'm curious to see if the Windows 2000/XP on **Android** works. That'd be trippy.
#accessibility #blind #AssistiveTechnology #Linux #gaming #TTS #VirtualMachine #foss #Android
On Tuesday, May 21st, from 1:15 pm to 2:30 pm EST, Aira’s CEO, Troy Otillio, will be speaking at the United Nations panel: "Promoting Inclusive Ageing Through Technology: Collaborative Perspectives and Cases."
As the leading company in visual interpreting, Aira has consistently expanded to meet the needs of its growing user base through product innovation and partnerships built around global accessibility initiatives. Inclusion on this panel alongside leaders in the accessibility industry highlights Aira’s impact on the blind and low vision community across all walks of life. At the session, CEO Troy Otillio will share Aira’s innovative approaches and experiences in leveraging technology for accessibility.
Join in person (New York, New York, UN Room CR2) or tune into the UN Live Stream to join the conversation on promoting inclusive aging and advancing the rights and well-being of older adults across the globe.
webtv.un.org/en/asset/k17/k17i…
#AccessForAll #OnYourTerms #AssistiveTechnology
Promoting Inclusive Ageing Through Technology: Collaborative Perspectives and Use Cases
The fourteenth session of the UN Open-Ended Working Group on Ageing is poised to address critical issues surrounding accessibility, infrastructure, and habitat for ageing populations. (UN Web TV)
Aira Event Alert: Visual Interpreting at the Kentucky Derby!
🎉Join us for the 150th Kentucky Derby with visual interpreting by Aira! Following our successful streaming of the 2024 solar eclipse, which engaged over 500 listeners, we’re thrilled to continue expanding access for the blind and low vision community.
When: Saturday, May 4th, with Aira coverage from 2:30 pm - 4:15 pm PST
Where: Live on the Aira Explorer app (more/events) and via Zoom (access link to come). The event recording will be posted on our YouTube channel for post-event viewing.
Our visual interpreters will be describing everything from race details to jockey silks, symbolism to event history, and horse breeds to competitor profiles. We know the importance of experiencing events to their fullest, and our team can’t wait to continue bringing visual interpreting to your event day festivities.
#InclusiveEvents #AccessForAll #OnYourTerms #AssistiveTechnology
Everette Bacon, Aira’s Vice President of Blindness Initiatives, was featured on the most recent episode of Talking Technology, sharing Aira’s history, functionality, and future developments, including the new Access AI feature and recent Eclipse event coverage.
#AccessForAll #OnYourTerms #AssistiveTechnology #Accessibility
youtube.com/watch?v=UAQ8YJA1Zb…
Talking Technology with V I Labs Episode 54 – Aira, Voice Dream, and a Solar Eclipse
On the show this week we catch up with Everette Bacon, Vice President of Blindness Initiatives at Aira Technologies, to find out about the inspiration behind ... (YouTube)
Are you coming to the CSUN Assistive Technology Conference in Anaheim, California next week?
We're looking forward to catching up with everyone there! Don't forget to swing by Orange County 1-2 at 2:20 PM on Thursday 21st March for our session. Learn what is new and coming up in NVDA, meet our new CTO, Gerald Hartig, and have the chance to ask questions of the team!
csun.edu/cod/conference/sessio…
#NVDA #NVAccess #CSUNATC24 #CSUN #AssistiveTechnology #Conference #MeetUp #News #Presentation
t.ly/fO2WU
#PerceivePossibilities #Accessibility #AI #ArtificialIntelligence #LetsEnvision #AssistiveTechnology
Envision App 3.3 Launches 'Ask Envision' for Scanned Text & A Richer 'Describe Scene'
Every update is a step towards greater accessibility and empowerment. (t.ly)
Many of the improvements are performance-related, taking advantage of the cache of accessibility tree nodes maintained by the AT-SPI service. Table processing has received particular attention, and fundamental changes are underway in the code that handles users' keystrokes, some of which need to be interpreted as screen reader commands, with the remainder being passed through to the application.
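To make that keystroke split concrete, here is a minimal, purely illustrative Python sketch (not Orca's actual code; the class and key names are invented) of a dispatcher that consumes keystrokes bound to screen reader commands and lets everything else pass through to the application:

```python
# Illustrative only -- not Orca's real implementation. It models the idea
# described above: keystrokes bound to screen reader commands are consumed,
# and the remainder is passed through to the application.

from typing import Callable, Dict


class KeystrokeDispatcher:
    def __init__(self) -> None:
        # Maps a key combination to a screen reader command handler.
        self.bindings: Dict[str, Callable[[], None]] = {}

    def bind(self, keys: str, handler: Callable[[], None]) -> None:
        self.bindings[keys] = handler

    def on_keystroke(self, keys: str) -> bool:
        """Return True if the keystroke was consumed as a screen reader command."""
        handler = self.bindings.get(keys)
        if handler is None:
            return False  # Not a screen reader command: pass through to the app.
        handler()
        return True


# Hypothetical usage: bind a "read current line" command to a key.
dispatcher = KeystrokeDispatcher()
dispatcher.bind("KP_8", lambda: print("speaking current line"))
print(dispatcher.on_keystroke("KP_8"))  # True: consumed by the screen reader
print(dispatcher.on_keystroke("a"))     # False: passed through to the application
```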
I have been testing some of the changes along the way, as have other users active on the Orca mailing list. Rapid and precise bug reports continue to contribute to the development process. At this point, it is reasonable to expect these valuable improvements to appear in a release during the first half of 2024, presumably as part of GNOME 46.
#linux #orca #ScreenReader #AssistiveTechnology #accessibility #Gnome
docs.google.com/forms/d/e/1FAI…
#braille #BrailleDisplays #AssistiveTechnology #accessibility
Braille Display Usage
This questionnaire contains questions about the usage of Braille displays in any life situation (education, work, personal use). (Google Docs)
Computers used to be so easy. The screen would give you a menu of options; you picked one using the keyboard. But then graphical interfaces came along and made things more complicated, especially for screen reader users.
What if any modern e-commerce website could be accessed using a simple menu interface? This is the dream of researcher David Cane.
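To picture what that could look like, here is a tiny, hypothetical Python sketch (the shop menu entries and actions are invented for illustration, not taken from Cane's work) of a keyboard-driven menu loop layered over an online store:

```python
# Hypothetical sketch of a keyboard-driven menu front end for a web shop.
# The menu entries and actions below are invented for illustration only.

MENU = [
    ("Search the catalogue", lambda: print("Opening search...")),
    ("View your basket", lambda: print("Reading basket contents...")),
    ("Track an order", lambda: print("Listing recent orders...")),
    ("Exit", None),
]


def run_menu() -> None:
    while True:
        # Present a numbered list of options, just like a classic text menu.
        for number, (label, _action) in enumerate(MENU, start=1):
            print(f"{number}. {label}")
        choice = input("Pick an option and press Enter: ").strip()
        if not choice.isdigit() or not 1 <= int(choice) <= len(MENU):
            print("Please type a number from the menu.")
            continue
        label, action = MENU[int(choice) - 1]
        if action is None:
            print("Goodbye.")
            break
        print(f"You picked: {label}")
        action()


if __name__ == "__main__":
    run_menu()
```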
Read the @NVAccess team's writeup of this intriguing session at #CSUNATC2023: nvaccess.org/post/in-process-2…
#a11y #AssistiveTechnology
#UserInterfaces
In-Process 24th March 2023
We’ve got a big issue for you this week, let’s get into it! CSUN Last week, the CSUN Assistive Technology Conference was held in Anaheim, California. The NV Access team were there to share informat… (NV Access)
Now you can listen to the talks of the CSUN Assistive Technology Conference 2023 directly on YouTube if you can't attend in person! 😀 👍
The CSUN Conference is an annual event organized by California State University, Northridge's Center on Disabilities. It brings together experts and enthusiasts from all over the world to discuss and showcase innovative assistive technologies and promote inclusion for persons with disabilities.
youtube.com/playlist?list=PLB7…
#accessibility #CSUN #assistivetechnology #a11y #AAC #Disability
2023 CSUN Assistive Technology Conference
Videos on this list may be hidden after the livestream while captions and video quality are reviewed. General Session videos will be hidden after the event c... (YouTube)