the whole ai-bro shtick about "ai democratizes art/programming/writing/etc" always seemed like bs to me, but i couldn't put it into words. i think i finally can.
ai didn't democratize any of these things. People did. The internet did. if all these things hadn't already been democratized and freely available on the internet, there wouldn't have been any training data in the first place.
the one single amazing thing that today's day and age brought us is that you can learn anything, at any time, for free, at your own pace.
like, you can just sit down and learn sketching, drawing, programming, writing, the basics of electronics, pcb design, singing, instruments, whatever your heart desires, and apply and practice these skills. fuck, most devs on fedi are self-taught.
the most human thing there is is learning and creativity. the least human thing there is is trying to automate that away.
(not to mention said tech failing at it miserably)
Zach Bennoui
in reply to Tech Goblin Lucy 🦝

miki
in reply to Tech Goblin Lucy 🦝
It democratizes it by making it available to the people who can't, don't want to, or don't have the time to learn it.
We're already seeing non-programmers successfully create quite substantial coding projects with AI, to an extent that surprises even me, who was a huge proponent of AI in coding from the start.
The same applies to art: there are many people who need or want art (small business owners, hobbyist game creators, wedding organizers, school teachers) but don't have the budget for the real thing.
Of course, many artists and programmers don't want this to happen and try to invent reasons why it's a bad idea, just as telephone operators didn't want the phone company to "force" customers to make their own calls, and just as elevator operators tried to come up with reasons why driverless elevators were unsafe.
Tech Goblin Lucy 🦝
in reply to miki
@miki without trying to convince you of anything (your stance on ai is yours, i'm not trying to change it), I can assure you that the reasons why many developers see generating production code with AI as a bad idea are not made up.
I am all for exchanging ideas between folks with different opinions, but this had to be said.
miki
in reply to Tech Goblin Lucy 🦝
I see putting a prompt into AI and hoping that the generated code is correct as a bad idea, especially in complex apps that have long-term maintainability considerations, or when security / money / lives are at stake.
For throwaway projects (think "secret santa style gift exchange for a local community with a few extra constraints, organized by somebody with 0 CS experience"), vibe coding is probably fine.
For professional developers, LLMs can still be pretty useful. Even if you have to review the code manually, push back on stupidity, and give it direction on how to do things, not just what to do (which is honestly what I do for production codebases), it's still a force multiplier.
Tech Goblin Lucy 🦝
in reply to miki
@miki that's a reasonable middle ground we can somewhat agree on.
I haven't seen AI-generated code be the "force multiplier" some folks swear by, especially with newer things like the config changes in pipewire last year, but i guess ymmv.
miki
in reply to Tech Goblin Lucy 🦝
I think with AI we're painfully re-learning the lessons we learned in programming over the last 70 or so years, just like crypto had to painfully re-learn the lessons that trad fi learned over the last five hundred years.
Yes, you can 20x your productivity with AI if you stop worrying at all about architecture and coding practices, just like you can 5x your productivity without AI by doing the same thing. Up to a point. Eventually, tech debt will rear its ugly head, and the initial productivity gains will be lost to the bad architectural decisions. Sometimes that