

Humans are also susceptible to hallucination
Do we have reason to believe LLMs will consistently and perpetually be "worse" than human participants?
--

Drew
Dunno.
I think the point is that LLMs are not and cannot be an "intelligence" as commonly understood. They can string together words that might, or might not, be a good approximation of information. Since the supposed point of them is to approximate knowledgeable humans ("they can pass the bar exam!!"), the fact that they cannot be hallucination-free seems to raise red flags. What good is an "expert" that you have to second-guess and double-check when it comes to anything important?

Hacker News thread where someone makes similar points about Google's "AI Overviews".

Yeah, humans make mistakes. But everyone knows that, and that's why "I want to speak to the manager" exists. What "manager" are we going to talk to when everyone we attempt to interact with is just an LLM instance?? We all know the ancient "computers cannot make mistakes" trope that gets trotted out when there are problems. There's a rather infamous example of that still playing out in the UK.

I'm sure there are examples of LLMs that do a decent job (what used to be called "expert systems" seems related, I think), but those didn't crawl TheOnion and Reddit in an attempt to get huge and crush their competitors. The good expert-system builders fed them known-good, or at least best-effort-good, information. But even then, humans always have to check the work.

Didn't someone say that once garbage info - like putting glue on pizza, or eating rocks - gets into these LLMs, it's impossible to get it out? They basically have to retrain the model with new data?

We'll see.

Cheers,
Scott.
That's an interesting test
Can you teach it that what it thinks it knows is wrong?

And again, show me that you can teach a Trumper that what they know is wrong and I'll concede that LLMs are worse than humans.
--

Drew
     Tailored responses for the user - (crazy) - (11)
         Better nuke it from orbit; it's the only way to be sure. -NT - (CRConrad) - (10)
             I want a toy but not a self-aware toy - (crazy) - (8)
                 Re: I want a toy but not a self-aware toy - (pwhysall) - (7)
                     Neither of us knows - (crazy) - (6)
                          arXiv.org - Ha[l]lucination is inevitable. - (Another Scott) - (5)
                             Humans are also susceptible to hallucination - (drook) - (2)
                                 Dunno. - (Another Scott) - (1)
                                     That's an interesting test - (drook)
                             Tangent -- re: "page-level autocomplete" - (CRConrad)
                             It's a solvable problem - (crazy)
             On the other hand, don't we do the same? - (crazy)
