Quote:
Originally Posted by JimmXinu
Nice. My understanding is that the companies are trying to train for more 'I don't know' due to all the publicity about hallucinations.
That's a good thing. I haven't used LLMs much so far, precisely because they're known to hallucinate (yes, yes, I know they're not intelligent or self-aware, but it's a convenient term), and you have to check their answers constantly, which is a bother. I do wish they'd just say so when they cannot find an answer.