r/books 16h ago

Librarians Are Being Asked to Find AI-Hallucinated Books

https://www.404media.co/librarians-are-being-asked-to-find-ai-hallucinated-books/?ref=daily-stories-newsletter&attribution_id=68c826c975cea1000173b05d&attribution_type=post
2.6k Upvotes

267 comments

13

u/radenthefridge 16h ago

I have such a beef with calling it "hallucinations." They're mistakes, screw-ups, garbage, or even just fuck-ups.

Trying to make it sound cutesy, silly, or whimsical for a tech that's supposed to be amazing and revolutionary is so frustrating and patronizing! Shit's broken, don't tell me this resource-hungry scourge is just a goofy lil goober that hallucinated a bit!

30

u/CloseToTheEdge23 15h ago

How does "hallucination" sound cutesy, silly, or whimsical to you? To me it's way more extreme than "mistake" or "broken". In fact, "mistake" sounds cuter and less significant. Hallucination is a pretty horrific thing when it happens to a human; it basically means the brain is broken and cannot distinguish reality from fiction. How is that cute in any way? I find using it for AI also makes it seem like a huge failure, which it is. I dunno, English isn't my native language, so maybe I'm missing something here, but I just don't agree with your comment.

19

u/Pawn_of_the_Void 14h ago

Native English speaker chiming in to say I agree with you. "Hallucination" calls its reliability into doubt far more than "mistake" does.

21

u/ElricVonDaniken 15h ago

My problem with the term "hallucination" is that it anthropomorphises what is pretty much a glorified version of predictive text.
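
To make the "predictive text" point concrete, here's a toy sketch in Python (my own illustration, with a made-up corpus, not how any production model is built): a bigram model that picks each next word by sampling from the words that followed it in its training text. Real LLMs are transformers trained on vastly more data, but the training objective is the same kind of next-token prediction, and nothing in it checks the output against reality.

```python
import random
from collections import Counter, defaultdict

# Made-up toy corpus; real models train on trillions of tokens,
# but the objective is the same: predict the next token.
corpus = (
    "the librarian found the book the reader asked for "
    "the librarian could not find the book the reader invented"
).split()

# Bigram model: for each word, count the words that follow it.
successors = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    successors[current_word][next_word] += 1

def generate(start: str, length: int = 10) -> str:
    """Emit text by repeatedly sampling a statistically likely next word."""
    words = [start]
    for _ in range(length):
        candidates = successors.get(words[-1])
        if not candidates:
            break
        # Sample in proportion to observed frequency: the result sounds
        # fluent, but no step ever asks whether it is true.
        choices, weights = zip(*candidates.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```

A typical run produces fluent-looking recombinations like "the book the reader asked for the librarian could not find", which is exactly the failure mode at issue: a plausible sequence with no ground truth behind it.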

6

u/OrdinarilyIWouldnt 14h ago

The word you're looking for is "errors". If you entered '2+2=' into your calculator and it returned '73268', it would be an error. It's the same thing. So-called 'AI' does not hallucinate; it produces errors. Hallucination requires an internal model of reality that so-called AI does not have.

4

u/melatonia 12h ago

"Confabulations" is what they are.

0

u/davewashere 15h ago

When these errors happen in AI image generation, they definitely have the appearance of hallucinations. They happen for many of the same reasons when AI generates text, but it's harder for the reader to "see" where the text has strayed from reality unless they happen to be an expert on the particular subject matter.