Items tagged with: blind
Anyone using NVDA: have you found that, with 2024.4 beta 4, reviewing Windows Terminal is very sluggish?
#accessibility #blind #travel
#Inclusion is the #topic of my conversation with Gabriela Hund from the
pastoral care service for visually impaired people of the Protestant #Church in Hesse and Nassau.
Coming soon in the #Blindenhörzeitschrift (audio magazine for the blind) "Das ABC-Journal" and in the #Podcast of @komin@bildung.social
kom-in.de/kina-podcast
... by the way, @gabrielahund@hessen.social would surely appreciate a proper welcome to the #FediVerse and a few #followers
#blind #Sehbehinderung #Selbsthilfe #Behinderung #FediKirche
Facebook's useless #AI based photo descriptions have made it to Threads; and, because they have, #blind people have to suffer for it. People think that posting photos on Threads without alt text is ok because of the utterly useless auto-generated descriptions. I thought I was rid of it when I abandoned Facebook in 2017. Please ask your friends, followers, and acquaintances to describe #photos on Threads.
So like I just have to ask. For people that are super critical of AI for accessibility, what do you expect instead? Do you want blind people to have human ... guides or whatever that will narrate the world around you? Do you want humans to describe all your pictures? Videos? Porn? Because that's about the only other option. And you may return with "Well audio description." And I return with "You think people are going to describe every YouTube video out there? Or old TV shows like Dark Shadows?" Because honestly that's what it'd take. If AI were *not* around, if we want *that* kind of access, that's what we'd have to ask, of darn near every sighted human in the world. And I just don't feel comfortable with demanding that of them.
Now, we'll see what Apple does to give us what will hopefully be even better image descriptions. Imagine a 3B model that is made with **high quality** images and description pairs, trained to do nothing but describe images. Apple has done pretty darn good without LLM's so far, so maybe they'll surprise us further. But my goodness, I'd much rather have something that, yes, makes me *feel* included, maybe a tad bit more than it actually *does* include me. And that's for each and every blind person to decide for themselves if they want to use AI for image, and probably soon, video descriptions, and what they're willing to trust with it. But for us to get this much real, human access, I just hope people who are detracting from AI understand that we who use AI are now used to having images described, and well, soon videos. It's just something that I don't think people should just deny quickly.
#AI #accessibility #blind #description #video #llm
This also seems to happen when any add-on tries to download something that isn't from NV Access's site.
Any idea why this is happening?
@NVAccess #nvda #blind @mastoblind
An important nuance I haven't seen so far, though, is that even within a marginalized group, opinions can vary wildly about the degree to which something is considered "a problem". The fact #twitch tried to get rid of "blind playthrough" in 2020 because it was considered ableist language is a great example of this: nobody in my circle thought this was problematic; we all had a good laugh and basically said they probably had bigger problems to worry about.
Now however, only a few years later, I see more and more sentiments shifting where that is concerned, asking writers not to use blind as synonymous for ignorant, stupid or incompetent. Same with terms like crippling debt. And as opposed to the Twitch example, this time it's actually #PWD who are making these points.
The question now becomes: Did times change, and did people get more offended by this / more hurt by this? Or is this simply yet another example of people finally coming forward about something that's irked them for decades?
I myself know where I stand on this, but I'd be a hypocrite if I assumed, in this case, that everybody thinks as I do, when I normally always preach caution about homogenizing #accessibility.
This post doesn't really go anywhere, I just thought it was an interesting bit of contemplating :))
#tech #technology #Linux #accessibility @mastoblind @main
Okay, so I just have to rant about something that I think is going to hold the Linux Desktop back a bit, especially for blind users. That is config files. Specifically, editing some file that's got no kind of linter, or checker, to make sure that whatever the user put in it works and doesn't leave one without speech.
So, this is specifically about the Speech dispatcher, which is a package in Linux a bit like Microsoft's SAPI. Of course, there's that new one, Spiel, that's being worked on, but I've not tried it yet. So, you can run spd-conf to set up a config file, but a lot of folks tell me that, in order to increase the default speech rate for stuff like games that use speech-dispatcher, web apps that speak through speech-dispatcher, self-voicing apps, I'd have to edit the config file.
Now, that's not awful for me, I can do it. But there really, really needs to be a GUI for it. And if you ask *me* why I don't make it myself, I'll tell you that if it's so easy, you should have made it with the snap of *your* talented fingers, instead of spending all that time being a reply-guy. Windows has a speech properties dialog. Mac has Spoken Content in System Settings. Linux should have something similar. And, if it turns out that there is and I just don't have the proper package installed, I'll gladly change my tune to "distros should have this installed by default."
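Even short of a full GUI, a linter for speechd.conf would catch the "typo leaves you without speech" problem. Here's a minimal sketch of what I mean, assuming the usual speechd.conf key-value syntax (`DefaultRate 0` style lines, `#` comments) and the documented -100 to 100 rate range; the function names are my own illustration.

```python
import re

# speechd.conf uses simple "Key  value" lines; DefaultRate ranges -100..100.
RATE_RE = re.compile(r"^\s*DefaultRate\s+(-?\d+)\s*$")

def set_default_rate(conf_text: str, rate: int) -> str:
    """Return conf_text with DefaultRate set to `rate`, validating the
    range first so a typo can't silently break your speech settings."""
    if not -100 <= rate <= 100:
        raise ValueError("DefaultRate must be between -100 and 100")
    lines = conf_text.splitlines()
    for i, line in enumerate(lines):
        # Replace an active setting, or uncomment a commented-out one.
        if RATE_RE.match(line) or line.strip().startswith("#DefaultRate"):
            lines[i] = f"DefaultRate {rate}"
            break
    else:
        lines.append(f"DefaultRate {rate}")
    return "\n".join(lines) + "\n"
```

A GUI slider could sit on top of exactly this kind of validate-then-write function, which is all I'm really asking for.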
I've also seen this in Emacs packages like Ellama, where you have to set up Emacs Lisp calls to the right function with the right parameters inside your `use-package` block, and it's hard for me to figure out how to add the OpenAI-compatible service that I have to Ellama. And AI hasn't helped yet.
So, having things being customizable is amazing. It really is. There are so many keyboard configuration settings, for example, that I think blind people are going to love playing with. But goodness, there needs to be something to change your system TTS stuff, and hopefully one day we can have a command to turn on and off the screen, like the iOS screen curtain so we can maintain privacy.
#Linux #accessibility #blind #LinuxDesktop #foss #tts
A much more detailed post about TalkBack's new Image description feature using #AI #Gemini
Okay, just right off the cuff here. Why does Vispero, a company making blindness software/hardware, want videos for their Big Thing? Why not just text? Emails? Like, what? Why? Meh, whatever. I'll submit mine I guess. I'm sure some of you know what I'm gonna suggest. Feel free to submit your own ideas; goodness knows we need more grand ideas for screen readers.
#accessibility #blind #Braille #JAWS #FreedomScientific #ScreenReader
#blind #DiabloIV streamers out there, the dungeon assists are not out yet. Please, if you're gonna stream this, know that ahead of time, because it's pretty concerning to see people stream for a bit till they get to a dungeon, give up in the middle because they can't easily navigate it, and then move on or quit altogether. It was never going to be in this PTR; Drew has only been attempting a solution for the past couple of months.
I suggest skipping the campaign and leveling up with overworld stuff instead, and if you can't skip the campaign because you didn't complete the prologue, make your way through the dungeon anyway and head to Kyovashad afterwards. The dungeon is, trust me, not really that hard to navigate without the assists; it's a square-like dungeon and probably the easiest in the game to hit your way through.
Where people got the idea that dungeon nav was out, I don't know, but it isn't at the moment. This is not directed at anyone in particular, but I see multiple streamers doing this, and it's a little concerning when expectations were only set for the overworld nav to be ready for October, with nothing more until a solution for dungeons is implemented later on.
So, I'm genuinely curious. NVDA now has the ability to show text formatting in Braille. I don't mean through dots 7 and 8, or HTML-looking tags--although it can do those too now--but through the formatting symbols created for UEB, or whatever Braille table you use. It also can now show paragraphs, either using spaces or a pilcrow sign.
So, can JAWS do either of these? I'm seriously wondering, because people are *always* saying that JAWS' Braille support is the best in the industry, and I just want to make sure I'm using it the best I can. I also have Leasey, so @hartgenconsult, if Leasey has features to help with that, I'd love to hear about them. I know there's the BrailleEase thing that I've not taken a look at yet. There's a ton of things I need to learn about Leasey.
For now, I know that iOS can show Braille formatting in the Books and Notes app, and the Mac can show it... in Safari as well I think. Linux... Well, Linux can't, besides the dot 7 and 8 stuff.
Braille formatting has really come far from the post I did like years ago, and I'm super happy about that. Hopefully it keeps going, especially in the age of multi-line displays.
Any #blind #nixos users out there? Any tips on how to install the thing accessibly?
I get to keep my work hardware, so I'm thinking of playing with NixOS, but I can't launch Orca as part of the GNOME installation media. I've read things suggesting folks have done it, but neither the keyboard shortcut nor Alt+F2 "orca" after booting into the installer gets me any speech. Wondering if I need to load a full desktop first? I grabbed the GNOME media, but maybe this doesn't boot into a full desktop? OCRing shows language/package selection, but I can't activate the screen reader from here.
can anyone comment on what it's like to use @textualize interfaces as a #blind / Low Vision person?
I'd be seriously tempted to start writing my CLI tools in #python if I could make TUIs that worked decently for the visually impaired too.
Hi! Today at #AccessibleApple:
- Added support for ActivityPub, AKA Mastodon. Here is the list of accounts you can follow to get realtime updates from the forums.
@ios
@sitewiki
@announcements
@macos
@watchos
@other
@programming
I hope you enjoy.
#accessibility #apple #blind #technology
For those who use IBM TTS on Linux: get the IBMTTS dictionaries, copy them to /var/opt/IBM/ibmtts/dict, and rename the files like this: Enu-main.dic becomes main.dct. Do the same for the rest of those files too, root and abbreviation. Oh yeah, enu-abbr.dic should be abbreviation.dct. So you end up with: abbreviation.dct, main.dct, root.dct. And then it should work.
#linux #foss #accessibility #IBMTTS #blind
respectability.org/people-with…
People with Disabilities at Work - Disability Belongs
A guide to achieving economic independence and inclusion through employment and entrepreneurship from Ollie Cantos and RespectAbility.
Define Besprechung
Learn using BigBlueButton, the trusted open-source web conferencing solution that enables seamless virtual collaboration and online learning experiences. bbb.metalab.at
This is not to suggest that automatic descriptions aren't useful. Several tools available to #blind people are now capable of providing a good idea of the contents of photos when no descriptions are available. Apps and hardware can analyze photos and videos for quick access to the environment in ways that weren't possible a short time ago.
Even though these tools exist, automatic #descriptions should not be a substitute for alt text.
Thread:
I've conducted several informal experiments over the last few weeks about alt text for #photos as described by humans and as provided by #AI systems.
#LLMS, despite providing a plethora of details when describing images, still miss the nuances of what the photos contain. Human #descriptions certainly continue to be better at conveying context.
For this experiment, I asked several friends to send me photos and share #descriptions with me.
To give you one example, I was sent a photo of a table surface with a small coffee pot and a cup of coffee with foam on top. The table also had a plate of cookies, muffins, and croissants.
The #AI description described the drink in the coffee cup as a yogurt-based concoction. It also missed the cookies on the plate.
#accessibility #a11y #blind #photos #photo #photography