08-09-2025, 10:15 PM   #17
amirthfultehrani
Quote:
Originally Posted by Quoth
Even if off by default, it's the thin end of a wedge that could import nonsense and eventually cost the user and environment.

People that don't know what LLM really is and costs will turn it on.

People can separately use LLM / generative AI if they want. No need to spoil the best ebook management system and set of tools.

Quoth, thank you for the reply. You have raised important points that were central to how I approached this feature, so let me address each one directly:
  • "Thin end of a wedge / Spoil the best ebook system": 100% agreed, and I hope you may see how my intention is not to be this "thin end" or to "spoil" anything. My proposed feature is to be completely inert/invisible unless a user explicitly enables it - in other words, 100% opt-in. If you don't go into the settings and provide your own API key, it adds 0 overhead, makes 0 network calls, and uses virtually 0 resources. It is fundamentally just another tool in the toolbox, like the existing "Lookup" feature.
  • "Import nonsense": as it is, the suggested/proposed feature is strictly read-only. It never, under any circumstances, modifies the book's content. What it does at most is simply display the LLM's output in a separate, dockable side panel for the user to read, similar to how the current Lookup panel works - the book remains untouched.
  • "Cost the user and environment": This is for sure a hot-button issue. I think a wider discussion on this would definitely hold value, but, for sake of this thread, and to be neutral + clear, I shall say the following two things about the feature's design (in regards to LLM call transparency): 1. The user must provide their own API key -> they are in full control of their own billing/usage - Calibre itself wouldn't be connected to any paid service 2. The UI includes a session-based API call counter so the user is always aware of how much they are using the feature (and this can be expanded as desired)
  • "People can separately use LML": Absolutely, and this feature is meant to be for them! My goal is not to force AI on anyone. My goal is instead to streamline the workflow for users who already copy-paste text between the viewer and an external AI service. It is meant to be integration for convenience and not a fundamental change of purpose.

Thank you again for the valuable perspective, Quoth.