It's not "hallucinating"; that's a marketing spin term. It's an inevitable result of how these models work.
Without a massive change in how data is fed to these models, involving costly human curation, they are never going to cite sensibly (or in any way identify their sources).
The builders and purveyors of these LLMs don't even want to expose the sources.
The whole concept is broken.