
GNOME Shell running out of the box on postmarketOS 23.06

Kudos to pabloyoyoista; GNOME and postmarketOS contributor / liaison :postmarketos::gnome:

He's made the GNOME experience as upstream as possible. We've discussed further improvements.

BTW, he's been looking for help

#postmarketOS #GNOME #LinuxMobile


Currently playing the “will Iberia lose my luggage” game…
The hard part in a game of chicken is to know when to flinch
It seems I won, this time
A bag, sitting on the floor of the airport baggage claim
i never flew with iberia, how was the experience?
@jsalvador Typical for a low/mid-cost EU airline: small seats, and picky about the luggage in the cabin. Sadly, it means it's always a roll of the dice away from losing your luggage.
ugh, i thought they were a better company. This makes me feel better about avoiding them for my next trip
LCG? Say hello to home from me :)

I think I might need to Greasemonkey yet another web app: Whatsapp Web. They've done silly stuff like give the message list role="application". They have at least added keyboard support, but they don't correctly set a role or label on the message items, plus you have to shift+tab three times from the message text box to get to the message list, which is ridiculously inefficient. This stuff is all trivial to fix. Does no one there test this stuff and think "hmm, if I were a daily user, this would really frustrate me"? #accessibility

Devin Prater :blind: reshared this.

Thanks. Do you know how well it works with VoiceOver/ChromeVox, screen readers that don't do browse mode? We need it to work well for both, unfortunately.
@objectinspace I assume you mean Slack? I don't have personal experience, but I would imagine it would work fairly well. The user could likewise press up arrow/down arrow to move between the message list/text box and then use, say, the VO cursor to review the message if they wanted to dig deeper. The important point is making sure you focus the right thing at the right times; e.g. focus the text box as soon as you open a chat, but Whatsapp already does that.
@objectinspace And of course you want to make the label of message items as efficient as possible; e.g. put the time information last, that kind of thing.
Yeah, I meant Slack. I thought up arrow from the composer would make you select your most recently sent message for editing? Maybe it is just Discord that does that.

A swan on what looks like a collision course – taken as I was standing, apparently in a good spot for flying over, on a hill next to the lake it took off from. It was gaining altitude faster than it was closing the distance, so this didn't feel quite as perilous as it might appear!

#birding #birds #birdwatching #birdphotography #birdsofmastodon #photography #wildlife #wildlifephotography #naturephotography

A swan flying towards the camera, water and grass in the underexposed background.

libei 1.0.0 is out! Finally!

Almost 3 years of development (admittedly with pauses), and libei morphed from a "hey, let's wrap emulating input into a library" idea into what's basically a point-to-point transport layer for logical input events. With the pending XDG Desktop Portal and compositor changes we'll be able to emulate and capture input in Wayland environments. And probably do much more; there's already interest in accessibility implementations.

When making adaptive apps for GNOME, it's important that you don't think of smaller sizes as being a mobile mode. You still need to account for desktop users that simply want to use a smaller window - like if they're using a small laptop.

Be mindful that all your UI elements are still accessible with the pointer and via keyboard navigation.

yes! I’m very often using applications with their smallest possible width. Fractal, Tuba and Virtual Machine Manager come to mind.
This is so much better now that libadwaita has containers that don't suck! I have terrible memories of trying to fit Evolution's dialogs in netbook-sized screens for Meego 🤪

@matrix often have this issue on my iPad (even after reinstalling). Messages get through just fine (can read it on my iPhone). Been going on for months now :(
Matrix bugged
the 'Report bug' button in Settings

Sorry, neither #Linux nor #Firefox are supported by this provider if you want to see your psychiatrist via telehealth. Luckily, my doctor was willing to use an alternative encrypted chat platform. #ADHD
LifeStance HEALTH — Sorry, your browser is currently not supported. To ensure the best experience, please use one of the following browsers: 
* Windows and Mac: recent version of Google Chrome - highly recommended 
* Mac & iOS: recent versions of Safari 
* Windows and Mac: Microsoft Edge (latest version) 

JavaScript must be enabled, and cookies must be allowed.

OK so for those of you who are like me from the past and don’t listen to conferences, tech talks, etc. but are actually interested in #programming, I want to show you one feature of #Swift and/or #SwiftUI which actually blew my mind because of its simplicity. Before the #iOS 17 era you would do something like:
class User: ObservableObject {
    @Published var firstName = "Nuno"
    @Published var lastName = "Mao"
}

struct ContentView: View {
    @StateObject var user = User()
    var body: some View {
        Text("I am \(user.firstName) \(user.lastName)")
    }
}
Then, in subsequent views you could pass this class around by using the `@ObservedObject` property wrapper so that all views could stay in sync and point to the same data. However, now it works like this:
@Observable
class User {
    var firstName = "Nuno"
    var lastName = "Mao"
    private var eMailAddress = ""
}
struct ContentView: View {
    var user = User()
    var body: some View {
        Text("I am \(user.firstName) \(user.lastName)")
    }
}
This code brings some interesting implications with it, not least that we no longer need to mark our properties with the `@Published` property wrapper. Basically all publicly exposed properties are treated as `@Published` automatically without our further intervention. `private` properties stay untouched, like in the example above.
The problem starts when we want to make our data actually changeable, like so:
TextField("Name", text: $user.firstName)
TextField("Last name", text: $user.lastName)
Funnily enough, it won’t actually work, because “Cannot find '$user' in scope”. Or, maybe? They made it actually very clever: to minimize the overhead of the code generated at compile time, they once again force us to use a property wrapper. One small change and our properties can be bound to stuff:

struct ContentView: View {
    @Bindable var user = User()
    var body: some View {
        TextField("Name", text: $user.firstName)
    }
}
That’s all, stuff will now work as expected.
#programming #apple #iOS #macOS #WWDC

Of course that’s just the beginning of the great iceberg that is SwiftUI’s changelog for 2023. However there is some other neat stuff, for example #Xcode making its previews fully accessible for #blind #VoiceOver users! If you want me to write more about all of this and more, let me know. BTW, Paul Hudson from HackingWithSwift is my idol. Without him I wouldn’t be able to write and understand even 5% of what I write and understand now.

Devin Prater :blind: reshared this.

You will want to add @State to the `var user` in your iOS 17 example so that a new instance of `User` is not created each time #SwiftUI recreates the view. This is explained in
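Putting the thread's advice together, here is a minimal sketch of how the iOS 17 pattern might look. This assumes the `@Observable` macro from the new Observation framework and the local `@Bindable` wrapper trick for deriving `$` bindings; names mirror the examples above:

```swift
import SwiftUI

// iOS 17's @Observable macro replaces ObservableObject;
// all non-private stored properties become observable automatically.
@Observable
class User {
    var firstName = "Nuno"
    var lastName = "Mao"
    private var eMailAddress = ""  // private properties stay untouched
}

struct ContentView: View {
    // @State gives this view ownership, so User is not
    // recreated every time SwiftUI rebuilds the view.
    @State private var user = User()

    var body: some View {
        // A local @Bindable wrapper lets us write $user bindings.
        @Bindable var user = user
        VStack {
            Text("I am \(user.firstName) \(user.lastName)")
            TextField("Name", text: $user.firstName)
            TextField("Last name", text: $user.lastName)
        }
    }
}
```

This is just a sketch of the pattern as I understand it, not a drop-in file from the original posts.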

The new ".zip" domain is being used almost solely for malware. Some of the clicks are very deceptive, even to technically knowledgeable people. See the attached image for an example.

You can block all zip domains with the following uBlock Origin rule under My Filters:


Tell everyone you know.

Two URLs which look like a legitimate download of a zip file. One is legitimate, one deceptively takes you to a domain on the .zip TLD.

Hey guys, just a quick update on the training guide for Piper. Sorry for the long delay on this; the notebooks keep getting updated with new features every few days, and I thought it would be best to hold off publishing anything until that process has completed. We now have a way to test the created models before using them with NVDA, as well as a better way to actually export your models. I will need to test everything tomorrow just to be sure that it works as designed; when that's done, I will update the guide and publish it by the afternoon at the latest. Thank you so much for your patience. This field is evolving very fast, but it's all really exciting.

If people are so ready to return to the office then why do jobs keep lying about being remote?


This piece captures so much of my own thinking about generative AI in the context of my writing classes. I think there are productive ways of using the technology, but as a cognitive scientist, I’m very skeptical about the claims I’ve seen that eliminating “busywork” will open up space for students to do higher level work. To be clear, I do think that it could, perhaps most obviously for assignments that require writing but aren’t about learning to write.

The only way to build skills is to do the work, to practice, and unfortunately “practice” often feels like “busywork” to students if we instructors aren’t thoughtful about creating meaningful arenas for practice.


So that is what we need to be doing as instructors: designing opportunities for students to practice skill-building in ways that do NOT feel like the kind of busywork best avoided by turning to ChatGPT. I was recently re-reading James Lang’s “Cheating Lessons” book, and though it was written a decade ago and as such doesn’t say anything about generative AI, I actually think it’s a GREAT reference for how to create these kinds of assignments.


So I restarted my Discord client and got notified that I can switch to a proper username without the discriminator (the numbers after the username). I was able to get my prefered username and within about 2 hours, I have started getting harassed by a guy that wanted it. His account was created 2 months ago, mine was created September 2015.

So yeah, this whole Discord username switch is going to be a complete clusterfuck, be prepared!


Transgender friends -- trying to find a swimsuit for my daughter. Advice, recommendations? She wants something feminine but fairly modest, and wears around a 3X.

Please boost, if you have the right audience.

Tech speakers, it's 2023. Stop using moms as your example of a non-technical audience. It's wrong, it's not funny, and whatever you were saying, now most of your audience is not thinking about it.

Just use the exec team as an example instead and get on with your life.

reshared this

And dudes, if you think this is a small thing (it's not), then it should be a small thing for you to fix your shit. If you'd like to not have to deal with people being annoying when you do it, you can literally have that.

Everyone else, if you're in a position to do so without it being a risk for you, tell the speaker and the conference, every single time.

Also? Marketing? HR? Largely female-coded work, so you're doing the same thing, but now also showing that you don't really respect your colleagues. And yes, they are actually your colleagues, and they do notice. Punch up. The CEO will laugh too, and he can take it.

If he can't, you really want to change jobs anyway.

It's weird to me that Apple is sunsetting Macs made as recently as 2017. If you are buying a new Mac to stay on the latest and greatest software, please don't landfill your current Mac. Donate it to a FreeGeek or similar organization. An "unsupported Mac" can continue as a valued computer running Linux for many more years beyond "the sunset".

In honor of #WWDC23 I updated this bug demonstrating that Safari 16.5 on iOS and macOS breaks tables to the point of being unusable when CSS display properties are applied:

I have been tracking Apple’s ongoing failure to fix this #a11y bug for 5 years:


LOL! Just found that I mistyped my Summer Reading folder as "Sumer Reading" and now I'm thinking of a summer of talking about Gilgamesh and teaching people to read cuneiform.

So, I've been working on this report on commercial self-generated CSAM for a few months. This is an area that is not very well studied, and also poorly handled by many platforms. While Instagram is the focal point of the report, this affects a very wide swath of the tech industry, from payment services to e-commerce to social networking tools. Since some news sites are attributing every conclusion they've drawn to "the researchers", please read the actual report:

Same goes for other online services, really — I increasingly think we need to reëxamine how we think about sexual content online, with consent being the top priority. I might even go so far as to say that consent should be revocable when it comes to intimate imagery, which is not something any law or policy has addressed to my knowledge. Just some food for thought.
Now, a note on the Fediverse: SG-CSAM is not really a thing on here (other kinds are — thanks Japan), but the reason is simple: the Fediverse isn't popular enough to make it profitable. So don't gloat about this just yet, there are massive T&S issues that the Fediverse is going to get hit with that it is extremely ill-prepared for. As far as I know, no instance even has table stakes CSAM protections. Get on it.

I've been asked to look into how to become an Apple support specialist. Can anyone tell me what that looks like as a blind job seeker?


With grim determination, Earth threw up a bubble, wrapping all of them in as tight a shield as she could manage. She anchored it in the ground, reaching down to firm rock.

“Gennie, if you have any energy to spare, help hold that bubble, please.”

“Of course, Momma Earth.”

She could sense the girl’s smile, and somehow, that gave her hope.

Earth Splits, Pillars of the Empire Book 1
(Finally making progress!)


My usual rule of thumb is: if I don't think the media I post is worth writing alt text for, it's probably not worth posting or boosting. If I need to share it but feel lazy, I can post it in one of my chats or DM it to friends.

A family member had COVID about 6 weeks ago - they now have concerning high blood pressure, dizzy spells, and double vision to the point that they're wearing an eye patch, switching it between eyes to cope. I'm trying to gather some studies/info to pass on, they have appointments this week to figure out what's going on. They're kind and receptive, but definitely won't be considering that COVID might be playing a role - can anyone help lead me to the correct info to share with them?

Nothing to see here. I'm just a girl dreaming of the day we get separate sounds for different groups in Discord.🎸😇

JJ and I are two of the co-hosts for an audio-only experience of LuminatoFestival events, including live, on-location audio description of on-the-street events in real time. Listen live here starting today, June 7, at 6:00 p.m. Eastern. Today's live coverage starts with a walk with #LittleAmal. She's a 12-foot-tall puppet; she's a 10-year-old Syrian girl, looking for her parents, looking for a home. Listen as audio describers give us the visuals.

AXObserver is one of the worst APIs I’ve ever seen out of Apple.

1. There are no proper constructors on the class, so you have to call static methods (that aren’t even defined on the class) to create a new instance.
2. Said functions don’t return the object to you. You have to create the object, and then pass it as a pointer, checking the return value of the create function for error codes! What? This is Swift, not Objective-C, how about you get the hell away from me with your out pointers and error codes?
3. They’re required to be locked to a specific process. This means I can’t observe system-wide accessibility events without iterating through a list of visible processes and installing an observer for each and every one of them. This could be made more efficient by only doing it if it has foreground focus, but who in their right mind thought this was a good idea?

Someone please tell me there’s at least a better way to observe accessibility events system wide? Please? I can get past the unfriendly API, but not… that.
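For anyone who hasn't run into it, the create-then-check-the-error-code dance described above looks roughly like this. This is a hedged sketch, not production code; the `pid` value is a hypothetical target process, and remember an observer is locked to exactly one process:

```swift
import ApplicationServices

// Sketch of the AXObserver pattern: AXObserverCreate takes an
// out-pointer and returns an AXError instead of returning the
// observer directly, as complained about above.
let pid: pid_t = 12345  // hypothetical target process id

var observer: AXObserver?
let err = AXObserverCreate(pid, { _, element, notification, _ in
    // Called on the run loop whenever a registered event fires.
    print("\(notification) on \(element)")
}, &observer)

if err == .success, let observer {
    let app = AXUIElementCreateApplication(pid)
    // Ask to be notified when keyboard focus moves within this process.
    AXObserverAddNotification(observer, app,
                              kAXFocusedUIElementChangedNotification as CFString,
                              nil)
    // Callbacks are delivered through a run loop source.
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
}
```

Note this requires the accessibility permission in System Settings, and you would indeed have to repeat it per process for anything system-wide.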

Devin Prater :blind: reshared this.

Isn’t that an old (Carbon) API, from the days of Mac OS 9, C/C++, Pascal and no native object orientation in the OS?
@miki Available since 10.2, so pretty old, yeah.

Here it is: a demo of the Personal Voice iOS 17 feature reading a simple piece of text in my voice. Share and enjoy.

Devin Prater :blind: reshared this.

Dear Classmates--yes it's still a thing, I might actually be tempted to open an email from you. Okay, not! But getting my graduation year correct would be the first step to convince me that I need to open one of the daily spams you send me.

Once #WWDC23 goes live, I'll have my usual thread of #accessibility related content here. I used to do this on Twitter. It involves discussing the announcements as well as testing of features announced prior to and throughout WWDC. I will welcome questions and requests for testing. I think I'll like doing this here much more than I did on Twitter. Oh and this thread will probably go on for months. I'll post things periodically as updates come.


I know you want to try the latest and the greatest. But you also want to be completely sure that what you need day-to-day will work on your device. So be absolutely sure that you're not going to need your phone or your iPad for important work or something else. You'll hear me talk about possible bugs and behaviors along with testing notes in this thread. Being a rather technical user, I know how to get out of situations that I put myself in.

So, just be careful!

#WWDC #WWDC23 #Accessibility

Oh, and don't expect me to live-toot or something like that. I might occasionally pop in and say something that interests me about #WWDC. A lot of #accessibility announcements were prebaked when Apple announced all the major changes and features coming to their operating systems in May. If I see something that will impact accessibility, I'll note it during the keynote or, if I think of it, after the keynote.


All right. #WWDC is over and here's a general thought that I'm left with. The #accessibility story for the #VisionPro product is going to be a little mixed until we get to hear more details about people's experiences. That said, if Apple can actually deliver what they said they'd deliver, the accessibility story for some people becomes a lot more intriguing. Partially sighted people may find this device far more useful than a phone.


Having flexible gesture-based controls could be much more useful for disabled people with motor and/or some physical disabilities. As game controllers and other devices such as Bluetooth keyboards/pointers will be available, switch controls will also probably work. And eye-based navigation will work quite well.

This, of course, poses some significant concerns for #blind people who will not be able to interact using eyes. Even authentication will be a challenge.

#WWDC #accessibility #WWDC23

So, I'm waiting for Apple to answer some of those questions. For #blind people, performing finger-based gestures will most probably work the same as it does on devices with touch surfaces. Since any surface will do, gestures could be performed without having to worry about the device.

That said, the #accessibility story for the #VisionPro device may not be as mature as we think it will be. We'll just have to wait and see.


A bit of a concern for physical/motor disabilities: We have to consider the initial step of putting the device on. It may not be doable independently.

Some of the ergonomic issues also can be either exacerbated or could be relieved by the interface shown in #VisionPro.

Keep in mind that these are a lot of random thoughts just based on what I heard in the Apple keynote. There will be a lot of discovery over the week as we see APIs and discussions.

Stay Tuned.

#accessibility #WWDC #WWDC23

As a blind person, what excites me just from watching this is the strong possibility of having apps such as Envision, Seeing AI, and others take advantage of this new #VisionPro platform and bring their apps to it. It's far more powerful than anything that we've seen before. It erases a lot of challenges with phone cameras.

#WWDC #accessibility #WWDC23

All of this excitement aside, the device is certainly going to be bloody expensive. So whether disabled people will be able to afford it will be another question entirely. The question will be whether to purchase stand-alone hardware like Envision's glasses, or #VisionPro with the possibility that apps like Aira, Seeing AI, and others will show up there. A device that allows for a consolidated #accessibility experience, or something that stands alone?


On the other hand, Apple updating AirPods to make them more useful for #accessibility purposes is extremely good. Adaptive hearing based on environmental sounds will be quite useful for hearing health, as well as for quite a few deaf and hard of hearing people. Apple is silently making AirPods Pro their true hearing showpiece.

Note that I'm not sure whether these features will be limited to AirPods Pro yet. More when I know.


Now, let me see if I can take a look at some other iOS/Mac OS features that could benefit #accessibility.
Live voicemail transcripts for phone calls, which allow you to review voicemails as they are being recorded, are great for everyone, but better yet for #accessibility. Transcripts also come to voice messages. Great for Deaf/HH people.

Better keyboard suggestions through ML (note: Apple never said the word #AI) will enable speedier typing.


I'm choosing to see the new AirDrop features as a major #accessibility win in iOS/iPad OS. A lot of the new features are based on convenience, and they work quite nicely to enhance accessibility.


This is something new. I'd never expected Eloquence on iOS 17 to have a lisp. Lol.

#accessibility #WWDC #WWDC23

Apparently you can choose per-voice settings from the actions on a voice and change a lot of params.

I see a lot of people ignored my earlier warning about not downloading Developer betas for iOS, iPadOS, Mac OS on their primary devices. I swear I might mute you if you complain too much. *Sigh!*


Either downloading Siri voices takes a long time on iOS devices, or the functionality is broken in the iOS 17 dev beta 1. I'm trying to test Siri voices for Voiceover, since the neural voices are supposed to be available rather than the natural voices.


7 #accessibility developer sessions at #WWDC this week. Whoa. Seems like we will find out a little about #VisionOS's accessibility, since there is a session that will show developers how to build accessible experiences. This answers some of the questions: we will have access to that new platform. Since iOS and iPadOS frameworks are built in, I was pretty sure we would.

Sessions will also show devs how to adopt the new Assistive framework and more.

Brandon reshared this.

Lots of little buggies to file. Guess what I'll be doing while my body refuses to get sleep?

#accessibility #WWDC

Before I go do some other things, like take a break, I'll end on a positive note for tonight. So far, I haven't seen the Braille panning bug that was in iOS 16.5. I also haven't had an Eloquence crash, though it's pretty early for that.

I fully intend to test Braille. So let me know if there's something specific you want me to test, not only Braille but other things as well.

#accessibility #WWDC

Couple of notes on iOS 17 and speech:

The lisp that Eloquence is displaying is related to the higher sampling rate that's now configurable through voice settings. I believe I have it set properly to let me listen comfortably.
For Eloq, each voice can be configured with settings such as pitch, pitch offset, head size, etc.

#accessibility #WWDC23 #WWDC

Devin Prater :blind: reshared this.

Eloquence's dictionary issue still exists. Navada Access will continue to be a thing for now. Let's see if we can change this. These settings also exist for Eloq on the Mac.

Settings are also available for Vocalizer voices. The voice must be active or in language rotor to configure. These settings are related to things such as pauses.

#accessibility #WWDC #WWDC23

Devin Prater :blind: reshared this.

The instability I'd reported for Siri voices was nothing more than growing pains for the initial dev beta release. Some people might have seen the original Siri voices that were incredibly large, over 350 MB in size. After a little while, the new Siri voices showed up for download. As far as I can tell, these are all under 70 MB in size. They are quite responsive. I've tested the American ones for now.

I'll create audio demos later in the day.

#accessibility #WWDC

I still get the large Siri voices displayed in VoiceOver settings, and some very much smaller ones in the Spoken Content section. So for me, at least, the old Siri voices are still being used. And from a brief try-out, it also sounds that way to me. So whatever is supposed to happen is not happening yet for me. And I've had the beta installed for a few hours now.
@Marco Give it a little while. There could be a couple of things going on. Are you testing with American voices or other voices?
Both. I have both an American English and a British English entry in my language rotor. They both use Siri voices. And I know that only the American voice seems to be more expressive in contexts other than VoiceOver, the British one is of the slightly older versions that Siri voices 1 and 4 in U.S. are, not like 2, 3, and 5.
@Marco So far, I've been able to download the American Siri Voices and not any of the other languages. I had to delete all my American voices first. There's a lot of fluctuation as to what's going on.
Indeed. They magically appeared for me now as well. There really seems to be a lot going on back and forth between the devices and Apple servers.
@Marco Did you get any of the other language voices? Or was it just American ones?
It was just American ones. Like I said, the more modern voices. The other languages never got the boost the American voices got in terms of expressiveness, diversity, and other traits.
@Marco This is a good point. I'll file a request anyway. I know it will be ignored for now.
@Marco I'm wondering if Apple is provisioning them based on request. So.

As I suspected, Voiceover will be on the #VisionPro device and we'll be able to use the finger as a pointer rather than eyes. Now we just need to know how authentication will work.

Speaking of authentication and signing into devices, one of the features that Apple didn't discuss for iOS/iPad OS was the ability to sign in to a new device if you have nearby devices already signed in.

#accessibility #WWDC

I'll be eager to watch the #WWDC session that discusses the ability for developers to conduct accessibility audits of their own apps. It's one of the 7 #accessibility sessions.
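From what Apple has shown, the audit hook appears to live in UI tests. A hedged sketch of what that could look like, assuming Xcode 15's `performAccessibilityAudit()` on `XCUIApplication`:

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // New in Xcode 15: audits the current screen for issues
        // such as missing labels, low contrast, and Dynamic Type
        // problems, failing the test if any are found.
        try app.performAccessibilityAudit()
    }
}
```

This is a sketch based on the session description, not tested code from the posts above.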

The full #accessibility story for the #VisionOS and the #VisionPro initial device is far more impressive than I'd imagined. Then again I shouldn't be surprised. Even if Apple has a checkered past about maintaining some of the features properly as they should, they do have an ethos of making products fully accessible from the start. The list of accessibility features supported out of the box on these new devices is in the next post, which I gathered from the State of the Union.


Full list of #accessibility features:

Audio Descriptions
Support Per-App Settings
Dynamic Type
Reduce Transparency
Reduce White Point
Color Filters
Bold Text
Voice Control
Spoken Content
Background Sounds
Pointer Control
Dwell Control
Button Shapes
Accessibility Shortcut
MFi Hearing Devices
Subtitles and Closed Captions
Switch Control
Full Keyboard Access
Image Descriptions
Guided Access
Reduce Motion
Left/Right Balance
Mono Audio


Looking through a couple of interesting API additions from #WWDC related to #accessibility.

If your app has a custom video player (I'm looking at you, app formerly known as HBO Max), you can automatically detect flashing lights and dim them.


The more I use iOS 17 and iPadOS 17, the more I suspect that the automatic image descriptions feature has gotten an upgrade. I'm getting better descriptions than I was in iOS 16. I don't think it's my imagination. Could others confirm this? I'll have to set up scenarios where I get descriptions from the same source with both OSes. I think this could be fun or tedious. For now, I'll take it as fun.

#accessibility #WWDC #WWDC23

Yeah, they are a little better. Still gets caught on "a screenshot of a videogame" for a lot of images of, yes, video games, but it now describes them a little sometimes.
@devinprater Does it still say random stuff like cigarette, dice when reading text?
And, I just watched the session that looks at #VisionOS #accessibility details. It's really really good. I'll come back with notes.
I just watched that one, too. And I am super excited.

Apple is truly making the betas available to everyone. So I'll remind you again not to install betas, no matter how stable you think they are, on primary devices. That said, the iOS 17 betas are remarkably stable. Things could change; I've seen things breaking down in subsequent betas before. I'm not surprised about the stability in this release, since most features are nothing major, unlike iOS 16. Same re iPad OS and Mac OS.

Just be careful!

#accessibility #WWDC #WWDC23

All right then, here are some details about how #accessibility will work on #VisionPro.

#Voiceover has its own gestures similar to iOS.

* Pinch with thumb and index finger with the right hand to move forward (equivalent to flick right).

* Pinch with the thumb and the middle finger to move back (equivalent to flick left).

* Pinch with thumb and ring finger on the right hand or thumb and index finger on the left to select (equivalent to double tap).

#accessibility #WWDC #WWDC23

Brandon reshared this.

The crown will allow you to assign a shortcut to the #accessibility option of your choice. Triple-tap it on the #VisionPro device to launch.

RealityKit has a new accessibility component, which allows the assigning of accessibility properties. There are custom labels, values, and traits, as well as custom rotors, custom actions, and custom content. Activate and adjustable system actions can also be assigned.


As I'm trying to simulate these gestures by putting my hands on my lap, turning up my palms, and bringing various fingers together, I'm finding the interaction quite natural and faster than on a slab of glass. Since I don't have to hold anything, it could be very efficient.

#Accessibility #WWDC #WWDC23

Switching on Voiceover on #VisionPro turns off normal gestures to avoid the system confusing the user. There is a new direct gesture mode, however, that will enable developers to allow #blind people to interact with apps and actions. We'll be able to choose if we want to invoke the direct gesture mode. If an app is developed well, this will be like direct touch. I suspect it might be used more here though.

#accessibility #WWDC #WWDC23

Developers will need to do a couple of things differently or provide additional information, like spatial awareness, to Voiceover users in #VisionOS apps. That also includes gesture recognition announcements for direct gestures. System actions will allow additional capabilities. It will be necessary to make announcements about important events in your apps, such as entering a room, or an object showing up and its position. Think games.

#accessibility #WWDC #WWDC23

Voiceover gestures will also rely on multiple pinches, holds, and multiple fingers, all seemingly pretty simple for now.

Devs will need to be aware of dynamic type issues as well as contrast ratios.

New concepts for spatial computing: anchors can be used to position objects relative to other objects, relative to the hand, or to other things in the world. Things can also be anchored to the camera so that they appear in the same spot on the display; content then follows your head.
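A quick sketch of those two anchoring styles as I understand RealityKit's AnchorEntity API; positions are made up:

```swift
import RealityKit

// Anchored to a fixed spot in the world: stays put as you move.
let worldAnchor = AnchorEntity(world: [0, 1, -2])

// Anchored to the camera: appears in the same spot on the display
// and follows your head.
let cameraAnchor = AnchorEntity(.camera)
```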

#accessibility #WWDC #WWDC23

Anchors are important for Zoom users. Depending on how they're used, they could impact partially sighted users. Certain anchors will need to be avoided so that low-vision users can get closer to objects with Zoom to identify and read them. Positioning the Zoom lens on head-anchored content might be difficult.

#accessibility #WWDC #WWDC23

Devs will need to be especially aware of motion in #VisionPro apps, since a reduced-motion option is available for #accessibility purposes. Alternatives will be needed for zooming motion effects, rapid rotation effects, etc. when reduced motion is on. APIs and notifications will be available. Using cross-fading could help.
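As a sketch of the cross-fading idea, here's how a SwiftUI view might read the reduce-motion setting and swap a zooming transition for a fade; the view and its contents are hypothetical:

```swift
import SwiftUI

struct PanelView: View {
    // System setting: true when the user has Reduce Motion enabled.
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    @State private var isShown = false

    var body: some View {
        VStack {
            Button("Toggle panel") {
                withAnimation { isShown.toggle() }
            }
            if isShown {
                Text("Panel content")
                    // Cross-fade when Reduce Motion is on; zoom otherwise.
                    .transition(reduceMotion ? .opacity : .scale)
            }
        }
    }
}
```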


#accessibility features include inputs to accommodate people who use alternative input. The Dwell Control accessibility feature allows an alternative set of gestures for motor/physical disabilities without using the hands. It includes tap, scroll, long-press, and drag. Think of this as no different than inputs like game controllers. Pointer control will also be available instead of eye tracking: head movement, wrist position, or the index finger can serve as alternatives.

#accessibility #WWDC23 #WWDC

To conclude this #VisionPro #accessibility recap from the video: there are multiple considerations for accessible apps, including excellent-quality captioning for spoken content, and developers will need to ensure spatial awareness of captions and sounds. Turn on these options through accessibility settings and see how your app behaves. Use SwiftUI frameworks if possible. Use accessibility audits when developing your apps. In short, make your apps accessible.


Now to the feature that more than one of you have asked me about. That is, using the camera to read panels by pointing fingers at buttons on appliances.

Other than the fact that it's using the Arabic voice to speak in English, which is terrible, the feature works rather well. They need to iron out a couple of bugs there. In a way, it's quite funny. I'll try to do a couple of demos of these things tomorrow if I have time.

#accessibility #WWDC #WWDC23

Now that I've had a couple of days to think about #VisionOS and #VisionPro, I'm contemplating writing a detailed post on some of the implications for #accessibility. There were quite a few questions answered. Some still remain. What say you?

If I do this, I need to check on the status of my web site. Inaugural post, maybe?


If you think Apple is not interested in what everyone is calling #AI, just look at the #WWDC sessions on machine learning. They've had these sessions for years. While Microsoft, Google, and Amazon try to sell their cloud infrastructure for #ML use, Apple already has its platform in play, and has for those same years.

#GenerativeAI #MachineLearning #GPT

I'm liking the changes made to the share sheet in iOS 17 this year. I wonder if it uses ML to determine the best apps to share with given the context. So far, so good.

#Accessibility #WWDC #WWDC23

Alongside the new Android 14 Beta 3 builds, Google has also quietly released the first builds of Android TV 14 based on Android 14! Only emulator builds are available now.

At the same time, Google seems to be retiring Android TV 13 (!)

This code change was spotted by 9to5Google. In an upcoming release of Android Studio Canary with the latest AVD Manager, when you select an Android TV 13 image, you'll see that "Tiramisu is an unsupported Android TV version."

I can now corroborate this. A source tells me that Google informed Android TV partners a few days back that it would no longer certify builds based on Android TV 13.

They didn't really give a reason for discontinuing Android TV 13, just that they're increasing their focus on Android TV 14, which could be a big release for the platform.

I have reached out to Google for comment on this news.

Someone hold me back. Someone stop me. I am going to do terrible no good things to hCaptcha. I swear. Someone better stop me before it's too late.