Items tagged with: innovation
Happy birthday to ARPANET, the forerunner of the modern internet! 53 years ago, the first message was sent over this pioneering network, paving the way for a world of interconnection and innovation.
As a tech enthusiast, I am constantly amazed by the ways the internet has transformed our lives, allowing us to communicate, learn and share ideas across boundaries and borders. From social media to e-commerce, from telemedicine to remote work, the internet has become an essential part of our daily routines, enabling us to connect with others and access a wealth of information at our fingertips.
#ARPANET #InternetAnniversary #Transhumanism #TechInnovation
#InternetHistory #Networking #DigitalRevolution #Innovation #Communication #Cyberculture #OnlineCommunity #TechHerstory #NetNeutrality #OpenSource #Decentralization #DataPrivacy #Cybersecurity #DigitalRights #Accessibility #Inclusion #SocialImpact #DigitalTransformation #FutureTech #TechOptimism #InternetOfThings #AI
🔗 ethw.org/Milestones:Birthplace…
It’s no secret that many of us in the blind community have embraced the rapid advances in Artificial Intelligence over the past two years. We've witnessed firsthand how these technologies can be a powerful force for good, especially within our community. AI-generated image descriptions have revolutionized how we navigate the online world, offering a perspective previously unimaginable. This impact is now undeniable, transforming how we interact with the world.
I’ve declared the kingdom of the blind a republic—perhaps prematurely, but only by a small margin. With AI empowering us to perceive the digital world in new ways, we are no longer ruled by limitations, but actively shaping our future. Anthropic’s recent launch of ‘computer use’ marks the first steps into a new phase of AI evolution—one where AI agents begin to act independently on our behalf, initiating a shift in how we interact with technology.
As AI continues to evolve, so too will the Assistive Technology that many of us depend on. I envision a future where this intelligence becomes a true companion, guiding us seamlessly through both digital landscapes and real-world challenges. We may be just two years away from seeing JAWS, NVDA, or SuperNova transform into true Assistive Intelligence 1.0—or perhaps it will take a little longer. If AI has taught us anything, it’s that progress comes both more slowly than we expect and faster than we can possibly imagine.
What follows is my first attempt at describing how a screen reader of today could take the first steps towards becoming an Assistive Intelligence. If anyone wants to build it, I’d love to help if I can. Whatever you think, let me know:
“Proposed AI-Powered Self-Scripting Feature for JAWS Screen Reader
Objective
The suggested feature seeks to integrate advanced AI-driven "computer use" capabilities, like those developed by Anthropic for Claude, into the JAWS screen reader. This functionality would enable JAWS to autonomously create and refine custom scripts in response to real-time user interactions and application environments. The aim is to enhance accessibility and productivity for visually impaired users, especially when navigating non-standard or otherwise inaccessible software interfaces.
Feature Description
The self-scripting capability would empower JAWS to analyse user interactions with applications, identify recurring actions or inaccessible elements, and generate scripts that optimize these processes. By enabling JAWS to perform this autonomously, users gain seamless and personalized access to applications without manual intervention, allowing for an enhanced, efficient experience.
The self-scripting feature will be powered by the following core functions:
1. Real-Time Autonomous Scripting: JAWS would use AI to observe user interactions with applications, especially non-accessible ones, and automatically generate scripts that improve navigation, label untagged elements, and streamline frequent tasks. For example, if a user frequently navigates to a particular form field, JAWS could create a shortcut to this area.
2. Adaptive Behaviour Learning: This feature would allow JAWS to recognize patterns in a user’s interactions, such as repeated actions or commonly accessed elements. JAWS would adapt its behaviour by creating custom macros, enabling faster navigation and interaction with complex workflows (a rough sketch of this idea follows after this list).
3. Dynamic Accessibility Adjustment: Leveraging Claude’s approach to visual recognition, JAWS could interpret visual elements (like buttons or icons) and provide instant labelling or feedback. This would be valuable in software with minimal accessibility features, as it enables JAWS to make live adjustments and effectively “teach itself” how to navigate new environments.
4. Community Script Sharing: Self-generated scripts, once verified, could be anonymized and made available to other users via a shared repository. This would foster a collaborative environment, empowering users to contribute to a broader database of accessibility scripts for applications across various industries.
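To make the first two core functions more concrete, here is a rough, purely illustrative sketch in Python. Nothing in it is real JAWS scripting or an existing Anthropic API; names such as InteractionEvent, detect_repeated_sequences, and generate_macro are assumptions used only to show how repeated interactions might be spotted and turned into a macro suggestion the user can approve.

```python
# Illustrative sketch only: how a screen reader might spot repeated interactions
# and propose a macro. Every name here is hypothetical, not part of JAWS or any
# Anthropic API.

from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class InteractionEvent:
    app: str       # application the user is working in
    control: str   # UI element that was reached, e.g. "OrderForm.Quantity"
    action: str    # what the user did: "focus", "activate", ...

def detect_repeated_sequences(events, length=3, min_repeats=3):
    """Return short action sequences the user performs over and over."""
    windows = Counter(
        tuple(events[i:i + length]) for i in range(len(events) - length + 1)
    )
    return [seq for seq, count in windows.items() if count >= min_repeats]

def generate_macro(sequence):
    """Turn a repeated sequence into a macro suggestion the user can approve."""
    return {
        "app": sequence[0].app,
        "steps": [f"{e.action} {e.control}" for e in sequence],
        "needs_user_approval": True,  # nothing runs without explicit consent
    }

# Example: the user keeps walking through the same three controls.
log = [
    InteractionEvent("OrderApp", "OrderForm.Customer", "focus"),
    InteractionEvent("OrderApp", "OrderForm.Quantity", "focus"),
    InteractionEvent("OrderApp", "OrderForm.Submit", "activate"),
] * 4

for seq in detect_repeated_sequences(log, min_repeats=4):
    print(generate_macro(seq))
```

In practice the pattern detection would be far richer (and likely model-driven), but the key design point stands: the AI proposes, and the user stays in control of what actually becomes a script.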
Value Proposition
This feature will address key challenges for visually impaired users, including the complexity of navigating inaccessible interfaces and the time-consuming nature of repetitive tasks. The ability for JAWS to generate its own scripts autonomously would mean:
1. Increased Accessibility: Improved interaction with non-accessible software interfaces.
2. Higher Productivity: Reduced need for external support or manual scripting, allowing users to accomplish tasks more independently.
3. Enhanced User Experience: Scripting and macro creation based on personal usage patterns, leading to a more intuitive and personalized experience.
Technical Considerations
1. Performance: Processing real-time visual and user interaction data requires substantial computing power. A cloud-based model may be optimal, offloading some processing requirements and ensuring smooth, responsive performance.
2. Safety: Automated scripting must be closely monitored to prevent unintended interactions or conflicts within applications. Integration of safeguard protocols and user settings to enable/disable autonomous scripting will be essential.
3. Privacy: Data collected from user interactions would be anonymized and handled in compliance with rigorous privacy standards, safeguarding user preferences and behaviour (a brief sketch of the opt-in and anonymisation idea follows after this list).
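As a companion to the safety and privacy points above, here is a similarly hypothetical sketch of opt-in settings and basic anonymisation applied before any self-generated script is shared. The settings keys and the anonymise/maybe_share helpers are assumptions for illustration, not an existing JAWS configuration or API.

```python
# Illustrative sketch only: opt-in settings plus basic anonymisation before a
# self-generated script is ever shared. All field and function names are assumed.

import hashlib
import re

settings = {
    "autonomous_scripting_enabled": True,    # user can switch the feature off entirely
    "share_scripts_with_community": False,   # sharing is opt-in and off by default
}

def anonymise(script_text: str) -> str:
    """Strip obvious personal details (profile paths, e-mail addresses) from a script."""
    script_text = re.sub(r"[A-Za-z]:\\Users\\[^\\\s]+", "<user-profile>", script_text)
    script_text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b", "<email>", script_text)
    return script_text

def maybe_share(script_text: str):
    """Upload a script to the community repository only with explicit consent."""
    if not settings["share_scripts_with_community"]:
        return None  # never leaves the machine without the user opting in
    cleaned = anonymise(script_text)
    return {
        "script": cleaned,
        "fingerprint": hashlib.sha256(cleaned.encode("utf-8")).hexdigest(),
    }

# With sharing switched off, nothing is uploaded at all.
print(maybe_share('Label "C:\\Users\\lottie\\invoice.pdf" for jane@example.com'))  # -> None
```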
Conclusion
Integrating AI-powered self-scripting capabilities into JAWS would represent a significant leap in screen reader technology. By allowing JAWS, when requested, to autonomously learn, adapt, and script in response to user needs, this feature could provide visually impaired users with unprecedented control and flexibility in navigating digital environments, fostering both independence and productivity. The anticipated benefits underscore the feature’s potential to redefine accessible technology, turning the screen reader into an Assistive Intelligence.”
About the Author:
Lottie is a passionate advocate for the transformative potential of AI, especially within the blind and visually impaired community. She blends technical insights with a keen awareness of lived experiences, envisioning a future where AI doesn’t just assist but truly empowers. Her thoughtful reflections explore the shift from a "kingdom of the blind" to a republic, where emerging technologies like AI create new opportunities for autonomy and inclusion.
With a balance of optimism and critical realism, Lottie acknowledges the game-changing impact of AI tools like image descriptions while recognizing that more progress is needed. Her vision extends to the idea of "Assistive Intelligence," where screen readers like JAWS evolve into proactive companions, adapting to users' needs in real-time.
Known for turning complex ideas into actionable blueprints, Lottie is not just an observer of technological trends but a catalyst for innovation. Her proposals reflect a desire to elevate independence and productivity for blind users, pushing the boundaries of what's possible in assistive technology. Her insights continue to inspire conversations and shape the future of accessible tech.
I am the Blind AI, relying on AI every day to enrich my life. While my posts may occasionally benefit from AI assistance, the thoughts, perspectives, and final edits are entirely my own. AI is my tool, much like a calculator or spell-check, refining my expression but never replacing my voice.
#Accessibility #AI #AIsoftheBlind #Blind #ComputerVision #Disability #Innovation #JAWS #NVDA #ScreenReader #SuperNova
Check out the latest guide I wrote for @iaccessibility on Ray-Ban Meta Smart Glasses!
Discover how these smart glasses are redefining accessibility for blind and visually impaired users with AI-powered features.
iaccessibility.net/ray-ban-met…
#Accessibility #RayBanMeta #SmartGlasses #AssistiveTechnology #VisuallyImpaired #Innovation #TechForGood #Inclusion #WearableTech
Being #OpenSource has many advantages. For #NVDA, it has opened the way for community contributions and enabled #transparency, #security and #innovation beyond what might have been possible in closed software. Increasingly, governments are also mandating the use of open source. Here is an article on such a step forward in Switzerland:
"Switzerland Makes Open Source Software Mandatory For Public Sector"
news.itsfoss.com/switzerland-o…
#FOSS #NVDA #NVDAsr #Accessibility #Software #News
Switzerland Makes Open Source Software Mandatory For Public Sector
A big boost to the open-source community and an inspiration to other public sectors! Sourav Rudra (It's FOSS News)
Aira Integration: Meta Ray-Bans and ARxVision Smart Glasses Pilot 👓
You've asked, and we've listened! As we begin the National Federation of the Blind 2024 convention, we are excited to announce pilot programs featuring both the Meta Ray-Bans and the ARx AI Gen1.5. By integrating with the latest technology, we aim to provide even more options for convenient, innovative, and fashionable visual interpreting!
Check out our new blog post for more details and information on how to join the pilot programs: aira.io/new-wearables-pilot/
#NFB24 #Aira #SmartGlasses #ARxVision #MetaRayBan #Innovation #AssistiveTech #VisualInterpreting #BlogUpdate #Accessibility
New Wearables Pilot Programs Launch!
Aira is launching two innovative new pilot programs with Meta Ray-Ban smart glasses & ARx glasses. Learn how these programs work and get involved. Hannah Griffin (Aira)
We often think of philanthropy purely as donations by the rich, but it is something we can all do. Philanthropy is broader: "the desire to promote the welfare of others". How can you promote the welfare of others this week? Do you know anyone who might benefit from knowing about NVDA?
#pw24 #PhilanthropyWeek #philanthropyawards #Philanthropy #Queensland #Giving #Innovation #NVDA #ScreenReader #Accessibility
In June 2023, Mick & Jamie, NV Access founders, were honoured with the Philanthropy Innovation Award by Queensland Gives. Last night, we were invited back for the launch of Philanthropy Week 2024. We are pleased to share this photo of James, Mick and David Oliver (CEO, Speld QLD).
#pw24 #PhilanthropyWeek #philanthropyawards #Philanthropy #Queensland #Giving #Innovation #NVDA #ScreenReader #Accessibility
In Spain, the Silence S04 is being developed as an electric #Nanocar, an environmentally friendly alternative to the fossil-fuelled small car (#Kleinwagen). Light and efficient, it uses less energy and is whisper-quiet. With a range of 149 km and a removable battery, it offers flexibility in everyday use.
#Elektromobilität #Nanocar #Klimaschutz #Verkehrswende #SilenceS04 #Umweltfreundlich #Innovation
taz.de/Elektromobilitaet/!6005…
Electromobility: Towards the transport transition in a microcar
Lighter, more efficient, electric: in Spain, a nanocar is being developed as a counter-model to the fossil-fuelled small car. A test drive. taz.de
locusmag.com/2023/12/commentar…
#AI #EconomicSpeculation #innovation #technology
Cory Doctorow: What Kind of Bubble is AI?
Of course AI is a bubble. It has all the hallmarks of a classic tech bubble. Pick up a rental car at SFO and drive in either direction on the 101 – north to San Francisco, south to Palo Alto – and … Locus Online