I’ve realized that every LLM hallucination is traceable to something and is never just total bullshit. Also, as far as I’ve experienced, LLMs are capable of explaining why they reasoned the way they did, so they can also explain how to avoid it. What do you think about this, and have you experienced something similar?
The less it knows about something, the more it will produce perfect nonsense with a straight face. It’s like replies on Twitter.