daniel:// stenberg://
A common AI slop pattern we see in #curl reports is when the AI finds an internal function somewhere in libcurl and then generates a PoC for the user that uses this internal function in a way that makes it misbehave or crash. But internally we don't use the function like that, and wouldn't, because then it fails.
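Later in the thread Daniel links one such report, about CURLX_SET_BINMODE(NULL). A minimal sketch of the pattern in C, with invented names rather than curl's actual code: an internal helper that skips argument checks because every internal call site already guarantees a valid stream, and the kind of "PoC" that calls it in a way the library itself never does.

#include <stdio.h>

/* hypothetical internal helper (names invented for illustration): like many
   internal helpers it does not NULL-check its argument, because no internal
   call site ever passes NULL */
static void internal_set_binmode(FILE *stream)
{
  setvbuf(stream, NULL, _IONBF, 0); /* uses the stream unconditionally */
}

int main(void)
{
  /* the AI-generated "PoC": call the internal function with an argument the
     library never uses, then report the resulting crash as a vulnerability */
  internal_set_binmode(NULL);
  return 0;
}

The crash is real, but it only shows that the PoC breaks the helper's internal contract, not that any real libcurl code path reaches that state.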

Lucy
in reply to Piggo
AI slop security reports submitted to curl (Gist)

Troed Sångberg
in reply to daniel:// stenberg://
It's obvious why the AI bug finders are up in arms over this. If they gobble up this code and re-use it in other code bases there WILL be problems there!
So, by not writing all of your internal functions as if they were external, you're making the AI coding tools less useful.
You're a bug.

David Sommerseth
in reply to daniel:// stenberg://
You have this "when all you have is a hammer, everything is a nail" idiom .... using AI for security research to find issues is more like a misunderstood, reversed approach ...

daniel:// stenberg://
in reply to daniel:// stenberg://
curl disclosed on HackerOne: CURLX_SET_BINMODE(NULL) can call... (HackerOne)

nobletrout
in reply to daniel:// stenberg://
the least they could have done was use markdown for the report.
serious question: why not add guards to make these sorts of fake problems go away? i know it's annoying to write code that'll get compiled away but it could keep the bots away?
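What such a guard could look like, as a sketch only (the macro names here are invented, not curl's actual ones): a debug-only assertion that is active in debug builds and compiles away to nothing in release builds, so it documents the contract during development at zero cost in production.

#include <assert.h>
#include <stdio.h>

/* illustrative debug-only guard: active when built with MY_DEBUGBUILD
   defined, compiled away to nothing otherwise */
#ifdef MY_DEBUGBUILD
#define MY_DEBUGASSERT(x) assert(x)
#else
#define MY_DEBUGASSERT(x) do {} while(0)
#endif

static void internal_set_binmode(FILE *stream)
{
  MY_DEBUGASSERT(stream); /* catch a bogus NULL caller in debug builds */
  setvbuf(stream, NULL, _IONBF, 0);
}

int main(void)
{
  internal_set_binmode(stdout); /* a normal internal-style call is fine */
  return 0;
}

In a debug build, the bogus PoC from above would stop at the assertion instead of crashing inside the helper; a release build still compiles the check away, which is the trade-off nobletrout is asking about.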

nobletrout
in reply to daniel:// stenberg://
i was wondering what the reason not to would be. this sounds like as good a one as any.
also i'm not a very serious person