Better HN
0 points
lynx97
11mo ago
0 comments
LLMs slurp up a lot of trolling and typical tech sarcasm through their training data. IMO that's one reason for "hallucinations".
alpaca128
11mo ago
That depends on how you define hallucinations. I'd say an AI repeating its training input is doing exactly what it's made for. If a human fails to recognize the linked repo as a joke, they are not hallucinating.
lynx97
OP
11mo ago
That's why I put hallucinations in quotes.