Items tagged with: AI


I saw a post on Facebook the other day. Someone, probably a student, but maybe a child, was asking if they could use AI to format an assignment. The poster specifically said they were not looking to have the AI write the report, just format the margins, font, line spacing, etc.

Some helpful soul replied along the lines of – just do the work yourself like the rest of us…you get the gist.

I am not advocating using AI to cheat. But I am aware of how modern word processors work – the concept of ‘WYSIWYG’ extends to far more than the text – sighted people can ‘see’ the margins, line spacing, justification, etc. Of course, this information is also available on the screen, and blind people can navigate to it or use a keystroke to have it spoken to them… but it takes a bit longer, and you have to remember the keystroke. In fact, you have to remember a lot of keystrokes to use a graphical interface designed for mouse users with just a keyboard.

There is a Blindness Productivity Gap: everything takes a little bit longer, is a little bit harder to learn, a little bit harder to do, when you are blind. This is not about effort or intelligence; it’s just how life works – ‘monkey see, monkey do’ is practically a law of nature.

One of the reasons why I am so excited about advancements in AI is that I see in it a real chance to finally reduce or eliminate the blindness productivity gap. I’d also like to live forever!

#Accessibility #AI #AIsoftheBlind #Blind #Disability


Gemini Live on Android: Real-Time Updates or Real-Time Letdowns? accessibleandroid.com/gemini-l… #Gemini #AI


Looking at the #特斯拉 #TESLA quarterly revenue trend chart.

Compared with #NVDA's quarterly revenue trend, Tesla's trend lines hold for only a short time before a new one forms. And that is indeed the case!!

Besides the short-lived trends and the multiple trend lines, the gap between quarterly revenue and what remains after deducting cost of goods sold and operating expenses is also very large: cost of goods sold takes more than 70% of revenue, and operating expenses take more than another 10%.

In other words, Tesla's quarterly revenue trend is for reference only; it is not an indicator investors can trust. When you look at the quarterly revenue you might think Tesla is very profitable, but once costs are deducted........

Still, all told, it's far better than the 3-4% gross margins of Taiwanese manufacturers 🤣

#經濟 #財經 #美國 #財報 #數據分析 #資料分析 #AI數據分析 #AI #MathAI


Looking at the quarterly revenue trend chart for NVIDIA #NVIDIA.
Three clear trend lines, very stable. The latest trend is rising very fast, which means production capacity is expanding steadily and rapidly.

You could say this is characteristic of manufacturing: stability. When the order book is full, what matters is how fast capacity can expand, so a quarterly revenue trend like this really tracks the pace of capacity expansion.

#經濟 #財經 #AI數據分析 #資料分析 #AI #MathAI #NVDA #英偉達


#AI spurs revolution for some visually impaired people:
bbc.com/news/articles/cg4lvw6v…

#BBC #accessibility


"Microsoft Office, like many companies in recent months, has slyly turned on an “opt-out” feature that scrapes your Word and Excel documents to train its internal AI systems. This setting is turned on by default, and you have to manually uncheck a box in order to opt out.

If you are a writer who uses MS Word to write any proprietary content (blog posts, novels, or any work you intend to protect with copyright and/or sell), you’re going to want to turn this feature off immediately.

I won’t beat around the bush. Microsoft Office doesn’t make it easy to opt out of this new AI privacy agreement, as the feature is hidden through a series of popup menus in your settings:

On a Windows computer, follow these steps to turn off “Connected Experiences”: File > Options > Trust Center > Trust Center Settings > Privacy Options > Privacy Settings > Optional Connected Experiences > Uncheck box: “Turn on optional connected experiences”"

medium.com/illumination/ms-wor…

#Microsoft #AI #GenerativeAI #AITraining #MSWord #Privacy #Word


Another random thing that made me happy tonight: I discovered that at some point #wordweb added a feature to disable linking of words in add-on dictionaries. That makes reading definitions with a #screenreader much faster. It's a bit expensive, but if you frequently find yourself resorting to Google or, even worse, #AI to define words for you, it's worth the money, and it's largely #accessible as well: wordweb.info/#a11y


Nvidia ( $NVDA ) reported Q3 earnings of $0.81 EPS on $35.1 billion revenue, surpassing expectations. The Data Center business hit $30.8 billion, up 112% YoY, while gaming revenue reached $3.3 billion. Nvidia's stock fell 1% post-announcement.

What do you think about Nvidia's future in the AI-driven market?

#Nvidia #NVDA #EarningsReport #AI #TechStocks #DataCenter #GamingRevenue #StockMarket #Investing #MarketTrends


Nvidia ( $NVDA ) is set to present earnings after markets close on Wednesday, with all of Wall Street eager to see if the AI bull market continues. Founder Jensen Huang is expected to set the tone for the industry, as Nvidia supplies roughly nine out of ten AI training chips to data centers.

Will Nvidia's results reaffirm investor confidence in the AI boom?

#Nvidia #NVDA #EarningsReport #AI #TechStocks #StockMarket #Investing #DataCenters #WallStreet #MarketTrends


A study asked 50 doctors to make six different diagnoses for medical conditions. "Doctors who did the project without AI got an average score of 74%, doctors who used AI got an average score of 76%, and ChatGPT itself got an average score of 90%." "AI didn’t help doctors using it as much as anticipated because physicians “didn’t listen to AI when AI told them things they didn’t agree. Most doctors couldn’t be convinced a chatbot knew more than them." #LLM #AI #ChatGPT qz.com/chatgpt-beat-doctors-at…


I give up. Now there's a Swiss church that has an #AI Jesus....

"Artificial intelligence is revolutionizing religious life at a small church in Lucerne, Switzerland, where a virtual Jesus is caring for the concerns and needs of the faithful."

Hold on...
Isn't this from the film THX 1138?

Good god, we're living in a low-budget 1970s dystopia.

dw.com/en/switzerlands-ai-jesu…

#AI


Android Accessibility Progress Report: What’s Changed (and What Hasn’t) Since Spring 2024 accessibleandroid.com/android-… #Android #A11y #AI


Two quick updates:
1. Be My Eyes now seems to support Bluetooth headsets. Remember to grant Bluetooth access after updating.
2. PiccyBot's latest update allows Pro users to describe videos up to 5 minutes long. For YouTube videos longer than 5 minutes with an available transcript, PiccyBot will now give a summary of the contents. #Android #AI


Nvidia ( $NVDA ) will report its Q3 earnings after the bell next Wednesday, with analysts expecting EPS of $0.74 on revenue of $33.2 billion, an 83% YoY increase.

Nvidia’s stock is up 189% YTD, while AMD ( $AMD ) and Intel ( $INTC ) have seen declines of 8% and 51% respectively.

Nvidia’s Data Center segment is projected to bring in $29 billion, a 100% increase YoY.

#Nvidia #NVDA #EarningsReport #AI #StockMarket #Investing #TechStocks #DataCenter #MarketNews


📱 AI apps like Be My Eyes or Seeing AI are game-changers! Tasks like reading labels, navigating spaces, or describing objects no longer always need sighted help. AI offers reliability, privacy, and convenience—something human helpers can't always guarantee. 🛠️
#AI #Blind #AssistiveTech #AccessibilityTools


"To all my writer friends that use MS Word - Microsoft has turned their Al bits and bots on to automatically go through anything done via Word. Here's how to turn it off. File > Options > Trust Center > Trust Center Settings > Privacy Options > Privacy Settings > Connected Experiences"

On the Mac I found it under Word/Excel/Etc > Preferences > Privacy, and near the bottom was the checkbox for "Turn on optional connected experiences" which of course was on by default.

#office365 #AI #microsoft


I wrote my own essay on #AI in healthcare a few weeks ago which aligns very well with the interviewed nurse's position:
tante.cc/2024/09/24/a-choice/
#AI


This is the line I see over and over again:

“While technology offers promising alternatives, it's essential to recognize the unique benefits that guide dogs provide beyond navigation, such as companionship and emotional support.”

If this is true, if these benefits are so unique, why do only 2% of blind and visually impaired people in the UK currently use a guide dog?

#Accessibility #AI #Blind #Disability #GDBA #Glide #Glidance #RNIB


Also, the #AI #bots have more rights than us #blind humans. Blind folks had to fight for years and years for copyright exemptions that would allow us to access and read books. The AI bots are just allowed to read as many books as they want, and swallow entire libraries whole. Yup; I'd definitely be better off as a bot.


So much for being the 'ethical, safe one'.

Anthropic teams up with Palantir and AWS to sell AI to defense customers

techcrunch.com/2024/11/07/anth…

#AI


I wish I was a #bot, not a #human. Whenever I get a #captcha, the #AI bots can all solve it just fine. I, being a #blind human, cannot.


I got so fed up with waiting for Meta to launch their AI for the Meta Raybans that I took the plunge and set up a VPN. Lo and behold, my glasses now have the AI feature. I’m really enjoying the ability to ask questions about my surroundings & get quick responses, but some of the answers have confirmed that AI is far from infallible. Today, I found a jar in the cupboard which I was pretty sure contained honey. I asked Meta and it said it was a jar of peanut butter. Wrong; it was honey, as confirmed by Be My AI. I suppose the contents are a similar colour, but I thought Meta might read the label.
But if it gets this sort of thing wrong, I’m not convinced it is going to take over the world. If it does, we’re in trouble.
#Accessibility #AI #Tech #Meta


A Day with JAWS 2035: When Your Screen Reader Scripts Itself

The morning light filters through your smart windows, casting a warm glow across the room. Your ambient AI assistant hums gently, “Good morning, Lottie. Would you like to prepare your workspace for the day?”

“Yes, please,” you say, stretching as the AI readies your home office. The blinds adjust automatically, leaving just enough sunlight to boost your energy without causing glare on your neuro-linked glasses. You smile, reflecting on the advances in technology since the days of fiddling with manual screen reader settings and customized scripts. Those days feel like a distant memory, thanks to JAWS’ AI-powered self-scripting feature—your personal assistant that knows exactly how to handle your work routine.

“Let’s get started,” you say, and JAWS springs to life, adjusting the audio tone to your preferred voice—smooth, confident, efficient. As your desktop computer powers on, JAWS begins analysing the applications you’ve opened, sensing your usual email, project management software, and a new program you’ve recently started exploring.

JAWS’ Real-Time Autonomous Scripting: A Custom Fit

“Good morning, Lottie. I’ve detected a new application in use: ResearchHub. Would you like me to generate an initial script for it?” JAWS asks in a gentle tone, its voice coming through the bone conduction implant in your ear.

You nod. “Yes, go ahead and script it.” This isn’t just any regular software; ResearchHub is dense, designed for researchers and developers with an intricate layout. In the past, navigating such software would have required hours of manually creating scripts or waiting for accessibility support. But today, JAWS’ AI-driven self-scripting feature allows it to analyse this program’s unique design and build custom commands as you go.

“Noted. I’ll adapt based on your usage patterns,” JAWS replies, instantly highlighting an unlabelled menu item. “I’ve labelled this as ‘Data Analysis.’ Would you like a shortcut assigned for quick access?”

“Absolutely,” you reply. Moments later, JAWS has created a keystroke, Control-Shift-D, which will take you directly to the Data Analysis section.

As you dive into your tasks, JAWS continues observing your interactions, quietly scripting shortcuts and macros that save you time with each click. You switch over to an email thread about your latest project, and JAWS dynamically adjusts, making sure to read each new message aloud with just the right level of detail. It’s responsive, intuitive, and seems to understand the flow of your work better than ever.

Adaptive Behaviour Learning: Anticipating Your Needs

JAWS has learned over time what works best for you—like knowing when you prefer concise summaries over detailed descriptions or when to read full email threads aloud. Today, though, as you work through complex calculations in ResearchHub, JAWS picks up on repeated actions, noting your frequent need to access specific data fields.

Without you having to prompt it, JAWS speaks up, “Lottie, I’ve noticed you’re navigating back and forth to the Analysis Settings panel. Would you like me to create a macro for this?”

“Yes, that’d be great,” you reply, surprised at how quickly JAWS anticipates these needs. It assigns a simple command, Control-Alt-S, making it even easier for you to access the settings. With each task, JAWS quietly observes, creating personalized shortcuts and learning how to refine your workflow without interrupting your focus.

Your screen reader feels less like a tool and more like an assistant that adapts to your habits, reducing unnecessary actions and helping you move seamlessly between applications. You take a moment to appreciate the leap from manually scripting these shortcuts to having them generated in real-time, tailored perfectly to your unique style.

Dynamic Accessibility Adjustment: Visual Recognition on the Fly

Halfway through the day, you open a report in a new format. The document is packed with complex graphics, diagrams, and untagged elements—historically a nightmare for accessibility. But JAWS, equipped with advanced AI-powered visual recognition capabilities, is ready.

“Diagram detected: This appears to be a bar graph comparing quarterly performance,” JAWS announces, automatically analysing the content. “Would you like a detailed audio description, or should I just provide the key values?”

“Let’s go with the key values,” you respond, eager to save time. In seconds, JAWS summarizes the data, translating it into accessible content without needing additional third-party support. When you encounter unlabelled buttons in another application, JAWS instantly identifies them and provides real-time labels, adjusting the accessibility on the fly.

The thought crosses your mind how revolutionary this is. You’ve moved past needing someone else to make documents or software accessible for you. Instead, your screen reader adapts and scripts the solution independently, as if it’s actively learning how best to support you.

A Collaborative Community of Scripts

As the day wraps up, JAWS asks, “Lottie, would you like to share the custom scripts I created for ResearchHub with the community repository? Other users might find them useful.”

“Yes, please,” you reply. Knowing that the scripts you and JAWS have tailored today could now benefit others brings a sense of community to your day. In the past, each user’s customization stayed personal, but today, JAWS’ community sharing feature allows anonymized scripts to be uploaded to a shared repository, where other users can download them for similar applications. This feature isn’t just a convenience—it’s a small way to contribute to something larger than yourself.

You smile, thinking about the ripple effect of this community effort. As JAWS users across industries contribute their self-generated scripts, the database grows, improving access for everyone.

Reflecting on Progress: A New Kind of Independence

As you finish your work, JAWS reads aloud your notifications, wrapping up your day with a recap. You reflect on how far technology has come since those early days of assistive devices. Back then, using a screen reader required you to work around its limitations, painstakingly scripting or finding ways to access inaccessible software. Today, your screen reader does the heavy lifting, allowing you to focus on your work without the constant barrier of inaccessible content.

Looking back, you remember those initial frustrations, the hours spent tinkering with manual scripts, and the reliance on tech support for inaccessible programs. Now, JAWS’ AI-powered self-scripting has not only given you more control but also reinforced your independence. It’s not just a tool—it’s a partner in productivity.

As you power down, you realize that technology has not replaced your determination; it has amplified it. JAWS has become a proactive assistant, predicting your needs, adjusting to your habits, and making the inaccessible accessible. With the day’s tasks complete, you feel a renewed sense of autonomy—knowing that the tools at your fingertips truly work for you, enhancing not just your productivity but your entire work experience.

The screen fades to black, and the AI’s voice recedes, leaving you with a quiet appreciation for a world where technology supports your strengths, not your limitations.

#Accessibility #AccessAssistive #AI #AssistiveTechnology #Blind #Disability #JAWS #ScreenReader


The concern here is that old search engines give you links, and you can follow them to verify the content. AI taking over our search engines hides the sources of information, forcing us to blindly trust the AI and wherever it got its information from.

#AI


Happy birthday to ARPANET, the forerunner of the modern internet! 53 years ago, the first message was sent over this pioneering network, paving the way for a world of interconnection and innovation.

As a tech enthusiast, I am constantly amazed by the ways the internet has transformed our lives, allowing us to communicate, learn and share ideas across boundaries and borders. From social media to e-commerce, from telemedicine to remote work, the internet has become an essential part of our daily routines, enabling us to connect with others and access a wealth of information at our fingertips.

#ARPANET #InternetAnniversary #Transhumanism #TechInnovation
#InternetHistory #Networking #DigitalRevolution #Innovation #Communication #Cyberculture #OnlineCommunity #TechHerstory #NetNeutrality #OpenSource #Decentralization #DataPrivacy #Cybersecurity #DigitalRights #Accessibility #Inclusion #SocialImpact #DigitalTransformation #FutureTech #TechOptimism #InternetOfThings #AI

🔗 ethw.org/Milestones:Birthplace…


The new release of Seeing AI for Android can now describe videos, currently up to 10 per day. Share an MP4 file with Seeing AI, and hear it described scene by scene.
Other features include:
Recognizing PDF documents.
After tapping "More Info" to get a rich description of an image, you can now ask questions on the same screen.
On the Product channel, both barcodes and Enhanced QR codes can now be recognized at the same time.
#Android #AI #SeeingAI


Exploring Samsung Galaxy AI’s Live Translate: In-Call Translation Put to the Test accessibleandroid.com/explorin… #Samsung #AI


What do people think of this response to ‘generate detailed Alt Text’ by Gemini:

Here is some detailed alt text for the image you provided:
Alt Text:
A large, menacing grey cat, reminiscent of a monster, stands on its hind legs with its front paws raised. It has glowing yellow eyes and a wide, toothy grin. The cat is surrounded by a fiery aura as it emits a powerful beam of energy from its mouth. The beam is directed towards the Empire State Building in New York City, which is engulfed in flames. The city skyline is visible in the background, with tall buildings and smoke rising from the destruction. The overall tone of the image is chaotic and destructive, evoking a sense of fear and awe.
Additional Notes:
• You may want to adjust the description of the cat's appearance based on its specific characteristics.
• If the image is part of a larger context (like a movie poster or book cover), you could add a brief reference to that context.
• The alt text should be concise and informative, providing a clear description of the image without being overly detailed.
Let me know if you have any other questions.

#Accessibility #AI #AIsoftheBlind #AltText #Blind #ComputerVision


I asked Amazon's "AI", Rufus, to summarize my space opera novella "The Warrior Within" for me. Here's what it told me:

"The Warrior Within is a 1980 fantasy novel by Angus McIntyre. It follows the story of a young man named Aric who lives in a world where magic and technology coexist. Aric discovers he has the ability to tap into an ancient power called the "Warrior Within" which allows him to become a powerful warrior.”

(TWW was published in 2018. There is no one in it named Aric.)

1/

#AI


Interesting Python job on an AI Forensics team: you will work on algorithmic investigation techniques to hold social media platforms accountable to their users and to the law.

ai-forensics.notion.site/Remot…

#GetFediHired #DataEngineering #Data #AI #Jobs #Safety


Did you know IBM helped Hitler carry out the Holocaust? Is it in any way surprising that we can’t do anything about Big Tech and toxic oligarchs like Elon Musk when even those who helped carry out one of the worst tragedies in human history remain unpunished? Should it come as a shock that the very same bastards are funding and pining for fascism today?

m.youtube.com/watch?v=1rhzge9r…

#BigTech #fascism #SiliconValley #ventureCapital #surveillance #capitalism #peopleFarming #BigData #AI



It’s no secret that many of us in the blind community have embraced the rapid advances in Artificial Intelligence over the past two years. We've witnessed firsthand how these technologies can be a powerful force for good, especially within our community. AI-generated image descriptions have revolutionized how we navigate the online world, offering a perspective previously unimaginable. This impact is now undeniable, transforming how we interact with the world.

I’ve declared the kingdom of the blind a republic—perhaps prematurely, but only by a small margin. With AI empowering us to perceive the digital world in new ways, we are no longer ruled by limitations, but actively shaping our future. Anthropic’s recent launch of ‘computer use’ marks the first steps into a new phase of AI evolution—one where AI agents begin to act independently on our behalf, initiating a shift in how we interact with technology.

As AI continues to evolve, so too will the Assistive Technology that many of us depend on. I envision a future where this intelligence becomes a true companion, guiding us seamlessly through both digital landscapes and real-world challenges. We may be just two years away from seeing JAWS, NVDA, or SuperNova transform into true Assistive Intelligence 1.0—or perhaps it will take a little longer. If AI has taught us anything, it’s that progress comes both more slowly than we expect and faster than we can possibly imagine.

What follows is my first attempt at describing how a screen reader of today could take the first steps towards becoming an Assistive Intelligence. If anyone wants to build it, I’d love to help if I can. Whatever your view, let me know what you think:

“Proposed AI-Powered Self-Scripting Feature for JAWS Screen Reader

Objective
The suggested feature seeks to integrate advanced AI-driven "computer use" capabilities, like those developed by Anthropic for Claude, into the JAWS screen reader. This functionality would enable JAWS to autonomously create and refine custom scripts in response to real-time user interactions and application environments. The aim is to enhance accessibility and productivity for visually impaired users, especially when navigating non-standard or otherwise inaccessible software interfaces.

Feature Description
The self-scripting capability would empower JAWS to analyse user interactions with applications, identify recurring actions or inaccessible elements, and generate scripts that optimize these processes. By enabling JAWS to perform this autonomously, users gain seamless and personalized access to applications without manual intervention, allowing for an enhanced, efficient experience.

The self-scripting feature will be powered by the following core functions:

1. Real-Time Autonomous Scripting: JAWS would use AI to observe user interactions with applications, especially non-accessible ones, and automatically generate scripts that improve navigation, label untagged elements, and streamline frequent tasks. For example, if a user frequently navigates to a particular form field, JAWS could create a shortcut to this area.

2. Adaptive Behaviour Learning: This feature would allow JAWS to recognize patterns in a user’s interactions, such as repeated actions or commonly accessed elements. JAWS would adapt its behaviour by creating custom macros, enabling faster navigation and interaction with complex workflows.

3. Dynamic Accessibility Adjustment: Leveraging Claude’s approach to visual recognition, JAWS could interpret visual elements (like buttons or icons) and provide instant labelling or feedback. This would be valuable in software with minimal accessibility features, as it enables JAWS to make live adjustments and effectively “teach itself” how to navigate new environments.

4. Community Script Sharing: Self-generated scripts, once verified, could be anonymized and made available to other users via a shared repository. This would foster a collaborative environment, empowering users to contribute to a broader database of accessibility scripts for applications across various industries.

Value Proposition
This feature will address key challenges for visually impaired users, including the complexity of navigating inaccessible interfaces and the time-consuming nature of repetitive tasks. The ability for JAWS to generate its own scripts autonomously would mean:
1. Increased Accessibility: Improved interaction with non-accessible software interfaces.
2. Higher Productivity: Reduced need for external support or manual scripting, allowing users to accomplish tasks more independently.
3. Enhanced User Experience: Scripting and macro creation based on personal usage patterns leads to a more intuitive and personalized experience.

Technical Considerations
1. Performance: Processing real-time visual and user interaction data requires substantial computing power. A cloud-based model may be optimal, offloading some processing requirements and ensuring smooth, responsive performance.
2. Safety: Automated scripting must be closely monitored to prevent unintended interactions or conflicts within applications. Integration of safeguard protocols and user settings to enable/disable autonomous scripting will be essential.
3. Privacy: To ensure user data is protected, anonymization protocols and data privacy standards will be implemented. Data collected from user interactions would be handled in compliance with rigorous privacy standards, safeguarding user preferences and behaviour.

Conclusion
Integrating AI-powered self-scripting capabilities into JAWS would represent a significant leap in screen reader technology. By allowing JAWS to, when requested, autonomously learn, adapt, and script in response to user needs, this feature could provide visually impaired users with unprecedented control and flexibility in navigating digital environments, fostering both independence and productivity. The anticipated benefits underscore the feature’s potential to redefine accessible technology, turning the screen reader into an Assistive Intelligence.”
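To make the Adaptive Behaviour Learning function a little more concrete, here is a minimal, purely illustrative Python sketch of the pattern detection it would rest on. Nothing here is real JAWS code: the NavigationEvent structure, the AdaptiveBehaviourLearner class, and the visit threshold are all invented for illustration, and a real implementation would hook into the screen reader's actual event stream and scripting engine.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass(frozen=True)
class NavigationEvent:
    """One observed user action: which element was focused, in which app.
    (Hypothetical: a real screen reader would capture far richer events.)"""
    application: str
    element: str

@dataclass
class AdaptiveBehaviourLearner:
    """Counts repeated navigation targets and proposes a macro once a target
    has been visited `threshold` times, mirroring the behaviour in the story."""
    threshold: int = 5
    _visits: Counter = field(default_factory=Counter)
    _already_proposed: set = field(default_factory=set)

    def observe(self, event):
        """Record one event; return a spoken proposal when a pattern emerges."""
        key = (event.application, event.element)
        self._visits[key] += 1
        if self._visits[key] >= self.threshold and key not in self._already_proposed:
            self._already_proposed.add(key)  # propose each macro only once
            return (f"I've noticed you're navigating to '{event.element}' in "
                    f"{event.application} repeatedly. Would you like me to "
                    f"create a macro for this?")
        return None  # no pattern worth proposing yet

# Example: after three visits to the same panel, the learner speaks up once.
learner = AdaptiveBehaviourLearner(threshold=3)
for _ in range(3):
    proposal = learner.observe(NavigationEvent("ResearchHub", "Analysis Settings"))
print(proposal)
```

Of course, the counting is the easy part; the real engineering lives around this loop, in capturing reliable events from arbitrary applications, generating and sandboxing the scripts themselves, and doing it all within the performance, safety, and privacy constraints the proposal describes.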

About the Author:

Lottie is a passionate advocate for the transformative potential of AI, especially within the blind and visually impaired community. She blends technical insights with a keen awareness of lived experiences, envisioning a future where AI doesn’t just assist but truly empowers. Her thoughtful reflections explore the shift from a "kingdom of the blind" to a republic, where emerging technologies like AI create new opportunities for autonomy and inclusion.

With a balance of optimism and critical realism, Lottie acknowledges the game-changing impact of AI tools like image descriptions while recognizing that more progress is needed. Her vision extends to the idea of "Assistive Intelligence," where screen readers like JAWS evolve into proactive companions, adapting to users' needs in real-time.

Known for turning complex ideas into actionable blueprints, Lottie is not just an observer of technological trends but a catalyst for innovation. Her proposals reflect a desire to elevate independence and productivity for blind users, pushing the boundaries of what's possible in assistive technology. Her insights continue to inspire conversations and shape the future of accessible tech.

I am the Blind AI, relying on AI every day to enrich my life. While my posts may occasionally benefit from AI assistance, the thoughts, perspectives, and final edits are entirely my own. AI is my tool, much like a calculator or spell-check, refining my expression but never replacing my voice.

#Accessibility #AI #AIsoftheBlind #Blind #ComputerVision #Disability #Innovation #JAWS #NVDA #ScreenReader #SuperNova


As a blind user, I find LLMs like ChatGPT pretty useful because they output audio or text. However, if they start generating videos, I wonder whether that will be as useful for us. Many videos, like YouTube tutorials, are optimized for a sighted audience, and models would most likely be trained on these types of videos and generate in a similar style. I can ask it to describe the video, but it won't be the same experience as videos designed with accessibility in mind. #accessibility #LLM #AI


In the debate about whether AI / LLMs can reason or not, it's good to remember this 1984 quote from Dijkstra, a Dutch computer scientist:
"The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim."
#AI #LLM #reasoning


Meta releases Spirit LM, a multimodal (speech and text) model. #Multimodal #LLM #AI #ML ai.meta.com/blog/fair-news-seg…


Found this in a discussion about Strava, but it's applicable to all tech companies.

#AI