I have what might be considered a slightly curious setup:
- A local folder on my desktop stores my library; this is synced up to a remote server via NextCloud.
- I run the Calibre GUI on my desktop to manage the library.
- I run Calibre, in content-server mode, on the remote PC, for convenient access via Calibre Companion.
Out of concern for maintaining database integrity, my desktop launch is wrapped in a script that first stops the Calibre content server on the remote system, then starts Calibre locally. Once the GUI exits, the script sleeps for 60 seconds to give the NextCloud sync time to finish, then restarts Calibre in content-server mode on the remote PC. I've considered hacking it to use inotify instead of the fixed sleep, but that just seems to be asking for trouble...
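For anyone curious, the wrapper boils down to something like this (a minimal Python sketch; the hostname, unit name, and helper function are placeholders, not my actual script):

```python
#!/usr/bin/env python3
"""Wrapper: stop the remote content server, run the local GUI, restart the server."""
import subprocess
import time

REMOTE_HOST = "remote-pc"       # placeholder hostname; public-key SSH login assumed
REMOTE_UNIT = "calibre-server"  # placeholder systemd unit name on the remote PC
SYNC_GRACE = 60                 # seconds to let the NextCloud sync settle

def remote_systemctl(action: str) -> None:
    """Run 'systemctl <action>' on the remote PC; sudoers allows only these calls."""
    subprocess.run(
        ["ssh", REMOTE_HOST, "sudo", "systemctl", action, REMOTE_UNIT],
        check=True,
    )

def main() -> None:
    remote_systemctl("stop")
    try:
        # Blocks until the Calibre GUI exits
        subprocess.run(["calibre"])
    finally:
        # Give NextCloud time to push the library changes before the server reopens it
        time.sleep(SYNC_GRACE)
        remote_systemctl("start")

if __name__ == "__main__":
    main()
```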
If anyone's wondering, all my PCs are Linux-based, so this kind of cross-network manipulation is pretty trivial: a combination of public-key login, a systemd unit file, and carefully constructed sudoers files, along the lines sketched below...
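The remote half is just this kind of plumbing (an illustrative sketch only; the unit name, user names, and paths below are placeholders, not my actual files):

```
# /etc/systemd/system/calibre-server.service (illustrative)
[Unit]
Description=Calibre content server
Wants=network-online.target
After=network-online.target

[Service]
User=calibre
ExecStart=/usr/bin/calibre-server /home/calibre/library
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

```
# /etc/sudoers.d/calibre (illustrative): the SSH user may control only this unit
desktop ALL=(root) NOPASSWD: /usr/bin/systemctl stop calibre-server, /usr/bin/systemctl start calibre-server
```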
I've had it running this way for a solid six months, and it seems to work really well, so I'm not looking to fix a problem. But I'm wondering if maybe I've solved a problem that doesn't actually need solving.
If I just leave the remote Calibre running constantly, will it handle things safely if/when the NextCloud sync updates the library files underneath it? What little I've found online implies not, which is why I decided to do it this way.
Before anyone points it out, I'm aware I could just run the remote server constantly and connect to it from the local Calibre desktop, but I've got a number of reasons for running it this way, not least that it serves as a rudimentary backup solution.
Sorry if I'm re-asking a question, but I've been searching for an answer for months, so I don't think anyone's asked it already...