Yesterday, 04:49 PM   #18
mitsie
Member
 
Posts: 21
Karma: 12814
Join Date: Jun 2025
Device: PocketBook 4

Quote:
Originally Posted by kovidgoyal
In my experience they still make stuff up. I have had conversations with AI that go along the lines of "I ask a question, it gives a made up answer, I tell it the answer is made up, it apologises and comes back with a new made up answer". If you happen to ask it a question about something not in its training data, it's pretty hopeless. It's very much a case of you need to understand the limitations of the tool; if you do, and use it keeping those in mind, it's very useful.

That said, I have no objection to adding such a line to the prompt. It won't do any harm. You can try and see what effect it has, if any, by asking a custom question instead of using one of the quick actions.
That's why my AI plugin uses Retrieval-Augmented Generation (RAG) with a local LLM: you get a successful answer 99% of the time.

With a cloud AI, the temperature is typically set midway between creativity and factual accuracy. The higher you turn the temperature, the more creative (and less predictable) the AI's responses become.

With a cloud service such as Gemini, you don't get to dictate the temperature. With a local LLM, you have full control.
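To see what temperature actually does, here's a minimal sketch in plain Python (not from my plugin, just an illustration): the model's raw scores for each candidate token are divided by the temperature before being turned into probabilities, so a low temperature concentrates almost all the probability on the top token, while a high one spreads it out.

```python
import math

def sample_weights(logits, temperature):
    """Turn raw model scores (logits) into sampling probabilities.

    Lower temperature sharpens the distribution toward the top-scoring
    token; as temperature approaches 0, sampling becomes effectively
    greedy (always the most likely token, so fewer creative surprises).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]                              # example scores
creative = sample_weights(logits, temperature=1.5)    # spread out
strict = sample_weights(logits, temperature=0.1)      # near-greedy
```

At temperature 0.1 the top token gets well over 99% of the probability mass; at 1.5 it gets barely half, which is where the "creative" answers come from.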

Like others have pointed out, an AI will simply answer that it does not know the book and return nothing. That's where RAG comes in: you load the ebook into the AI model from your filesystem. So no matter what the ebook is, even if it was never published or your mother wrote it, the AI still has access to the material and can summarise it.
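The core retrieval idea can be sketched in a few lines of plain Python. This is a toy version (word-overlap scoring instead of the embeddings and vector store a real RAG pipeline would use), but the principle is the same: find the passage of the book most relevant to the question and hand it to the LLM as context, so the model never has to rely on its training data.

```python
import re
from collections import Counter

def best_chunk(book_text, question, chunk_words=200):
    """Split the book into fixed-size word chunks and return the chunk
    that best matches the question, scored by simple word overlap."""
    words = book_text.split()
    chunks = [" ".join(words[i:i + chunk_words])
              for i in range(0, len(words), chunk_words)]
    q_terms = set(re.findall(r"\w+", question.lower()))

    def score(chunk):
        terms = Counter(re.findall(r"\w+", chunk.lower()))
        return sum(terms[t] for t in q_terms)

    return max(chunks, key=score)

# Toy "ebook" with two distinct topics:
book = ("Chapter one is about a lighthouse keeper. " * 50 +
        "Chapter two is about a storm at sea. " * 50)
context = best_chunk(book, "What happens in the storm at sea?")
prompt = f"Using only this excerpt:\n{context}\n\nAnswer the question."
```

The retrieved chunk is then prepended to the prompt, so even an unpublished book the model has never seen is fair game.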

Using local LLMs is beneficial not only because you have full control; it also stops authors' data being transferred to cloud AI services that harvest ebooks.

Turn the LLM's temperature down to 0 so it doesn't "make stuff up", and use RAG over your local ebooks for reference: that gives successful, accurate results 99% of the time.

That's why I wrote the AI plugins the way I did in the first place.