I've been very much enjoying running an LLLM (Local Large Language Model) on my Mac with #Ollama the last week or so, asking it random questions without taking up copious amounts of someone else's resources, and litres of water in some datacentre somewhere. My Mac has fans, but even with a complex question they barely bother to spin. I can send it images as well, and the responses are certainly up there with GPT-4, so it makes me happy.
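For anyone curious what this looks like in practice, here's a rough sketch of an Ollama session. The model names and the image path are just examples, not necessarily what I'm running; pick whatever model suits your machine.

```shell
# Download a model once (pick one sized for your RAM)
ollama pull llama3.2

# Ask it a one-off question from the terminal
ollama run llama3.2 "Explain the difference between RAM and unified memory."

# Multimodal models such as llava accept an image path in the prompt
ollama pull llava
ollama run llava "Describe this photo: /path/to/photo.jpg"

# See which models you have installed locally
ollama list
```

Everything runs on-device, which is why the fans stay quiet: no request ever leaves the machine.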
in reply to Andre Louis • • •Andre Louis
in reply to Charlotte Joanne • • •Also, do check Apple refurbish store, they have great deals, and still fully warrantee, and they even allow you to add Apple Care as if it were a brandnew device. Highly recommended. I've purchased refurb Mac's from Apple before and never had problems with them.