#AI #Reddit #PublicSafety
'AI chatbots have had a long history of hallucinating, and Reddit’s version, called Answers, has now joined the list after it recommended heroin to a user seeking pain relief.'