Drew Breunig (@dbreunig@note.computer)
Simplifying the AI noise by segmenting everything into 3 big use cases: Gods, Interns, and Cogs. https://www.dbreunig.com/2024/10/18/the-3-ai-use-cases-gods-interns-and-cogs.html
Matt Campbell
in reply to Simon Willison
AGI (Artificial God Incarnate) - LIVE (YouTube)

Simon Willison
in reply to Matt Campbell
Knowledge Worker (simonwillison.net)

Simon Willison
in reply to Simon Willison
What's Left for Me? (The AI Existential Crisis Song) - LIVE (YouTube)

Matt Campbell
in reply to Simon Willison
You sure you linked to the right song? Have you gone through the kind of existential crisis portrayed in that song? I always saw you as cautiously optimistic about AI: that it can be a useful tool if used well, but not one that makes us redundant.
@forrestbrazeal
Simon Willison
in reply to Matt Campbell

Matt Campbell
in reply to Simon Willison

Simon Willison
in reply to Matt Campbell
@matt @forrestbrazeal I've not felt the anger personally, because I've been releasing open source code for 20+ years so I already default to "I want people to be able to reuse my work as much as possible" - but I absolutely understand the people who ARE angry.

If I was an artist and someone trained Stable Diffusion on my work without my permission and then started competing with me for commissions in my own personal style, I think I'd feel very differently about all this.
Glyph
in reply to Simon Willison

Simon Willison
in reply to Glyph

Matt Campbell
in reply to Simon Willison
About running models locally: I've experimented with that, but large context windows take up lots of RAM, right? Like, isn't it O(n^2) where n is the number of tokens? Or do you not depend on large context windows?
@glyph
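(Editor's note on the O(n^2) question: the quadratic cost applies to the attention score matrix, which streaming/flash-attention implementations avoid materializing; the KV cache that dominates inference RAM grows linearly with context length. A rough back-of-the-envelope sketch, using illustrative Llama-7B-style parameters that are assumptions, not measurements:)

```python
# Rough memory estimates for transformer inference at context length n.
# All parameter values (32 layers, 32 heads, head_dim 128, fp16) are
# illustrative; real architectures vary.

def kv_cache_bytes(n_tokens, n_layers=32, n_kv_heads=32, head_dim=128, bytes_per=2):
    # K and V tensors per layer: each n_tokens x n_kv_heads x head_dim.
    # Linear in n_tokens.
    return 2 * n_layers * n_tokens * n_kv_heads * head_dim * bytes_per

def naive_attention_scores_bytes(n_tokens, n_heads=32, bytes_per=2):
    # Materializing the full n x n score matrix per head -- this is the
    # O(n^2) term, which memory-efficient attention kernels avoid.
    return n_heads * n_tokens * n_tokens * bytes_per

for n in (2048, 8192, 32768):
    kv = kv_cache_bytes(n) / 2**30
    scores = naive_attention_scores_bytes(n) / 2**30
    print(f"n={n:6d}  KV cache ~{kv:6.2f} GiB (O(n))   naive scores ~{scores:7.2f} GiB (O(n^2))")
```

With these assumed parameters the KV cache hits about 16 GiB at a 32k context, which matches Matt's observation that long contexts eat laptop RAM even though the growth is linear rather than quadratic.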
Glyph
in reply to Matt Campbell

Matt Campbell
in reply to Glyph
@glyph Yeah. Today I went down a rabbit hole looking into what it would take to reproducibly, deterministically build some video assets for a little side project I did. It reminded me of how some hard-core free software folks, particularly the GNU Guix developers, have been working on bootstrapping a whole OS with minimal dependence on existing binaries. We're never getting that for these big models.
@simon
Simon Willison
in reply to Matt Campbell
@matt @glyph The models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago; improvements in that space have been significant.

This new trick from Microsoft looks like it could be a huge leap forward too, though I've not dug into it properly yet: github.com/microsoft/BitNet
GitHub - microsoft/BitNet: Official inference framework for 1-bit LLMs
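(Editor's note: the headline win for "1-bit" models, which are actually ternary at roughly 1.58 bits per weight, is weight memory. A quick illustrative comparison for a hypothetical 7B-parameter model; this is back-of-the-envelope arithmetic only and ignores activations and the KV cache:)

```python
import math

def model_weight_gib(n_params, bits_per_weight):
    # Weight storage only: parameters * bits, converted to GiB.
    return n_params * bits_per_weight / 8 / 2**30

n_params = 7e9  # hypothetical 7B-parameter model
for label, bits in [
    ("fp16", 16),
    ("int4", 4),
    ("ternary {-1,0,1}", math.log2(3)),  # ~1.58 bits of information per weight
]:
    print(f"{label:18s} ~{model_weight_gib(n_params, bits):6.2f} GiB of weights")
```

Under these assumptions the ternary encoding cuts weight storage by roughly 10x versus fp16, which is why a scheme like BitNet's could matter so much for running capable models on laptop-class RAM.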