Items tagged with: bemyai


This is the Llava:34B LLM (a local large language model) running on my #MacBook Pro, describing a #screenshot from #BBC News. To me, this is information as good as what #BeMyAI would provide using #GPT4, so it goes to show that we can do this on-device and get some really meaningful results. Screenshot attached; the #AltText contains the description.
Lately, I've taken to using this to describe images instead of using GPT, even if it takes a little longer for results to come back. I consider this to be quite impressive.
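For anyone wanting to try something similar, here is a minimal sketch of asking a local LLaVA model to describe an image. It assumes the model is served through Ollama and queried with the Python ollama client; the original post does not say which runtime was used, and the prompt and file name here are illustrative.

```python
# Minimal sketch: describe an image with a local LLaVA model via Ollama.
# Assumptions (not stated in the post): Ollama is installed and running,
# the model has been pulled with `ollama pull llava:34b`, and the Python
# client is available via `pip install ollama`.
import ollama


def describe_image(path: str) -> str:
    """Ask the local llava:34b model for an accessible description of an image."""
    response = ollama.chat(
        model="llava:34b",
        messages=[
            {
                "role": "user",
                "content": "Describe this screenshot in detail for a blind user.",
                # Path to the image file; replace with the real screenshot.
                "images": [path],
            }
        ],
    )
    return response["message"]["content"]


if __name__ == "__main__":
    # Hypothetical file name, used only for illustration.
    print(describe_image("bbc_news_screenshot.png"))
```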


I can't contact #Microsoft's #Disability helpdesk on #BeMyEyes. What's happening is, when I tap on Contact Microsoft, the newly integrated #BeMyAI tries to help, and there is a Call Microsoft button in case the AI is unable to help. When I tap that button, it doesn't connect me to Microsoft; it connects me to a regular sighted volunteer. #Blind #Blindness #Accessibility


Alice was on the London Eye today, as one of her school friends decided to have their birthday party on it. She took this picture, which I ran through #SeeingAI and #BeMyAI. I really like both descriptions, so one of them has been included in the alt text and the other is threaded, as it wouldn't fit within the character limit. I think each of them brings something unique to the table.