If a made-up medical condition can fool major chatbots, get repeated with clinical confidence, and then slip into a peer-reviewed journal, the lesson is not just that AI hallucinations are annoying. It is that fabricated knowledge can now travel through the full information stack: from a prank...
AI chatbots are getting better at sounding authoritative, but the latest “bixonimania” episode shows how badly that confidence can outrun reality. A fictional condition invented by Swedish researcher Almira Osmanovic Thunström was absorbed by multiple major AI systems, which repeated it as if it...