#16 | creator of calibre
I don't follow; the speed of the content server is the same as the speed of the GUI. They use the same code for actual library operations.
#17 | Zealot (kjdavies)
I'll give it another shot after I finish this set.
#18 | creator of calibre
Unless your ebooks are really large, I don't see that making a significant difference.
#19 | Zealot (kjdavies)
I have noticed that the first addition after a large edit (i.e. many entries edited in bulk) tends to be quite a bit slower than usual too. ... for that matter, even the individual additions done locally (i.e. via calibredb directly against the library, not through the server) are going slowly, at about the same rate (four or five per minute) as I saw with the calibre server. Maybe the earlier ones were just faster?

Last edited by kjdavies; 06-05-2020 at 02:33 PM. Reason: accidental double post
#20 | Zealot (kjdavies)
Timestamps indicate the script has loaded 32 files in 11:14, i.e. slightly more than 20 seconds each. It used to be closer to 4-5 seconds each. Does library size make a difference? This library is approaching 10,000 files.
#21 | creator of calibre
Additions via calibredb without a server are going to be slow, because the library has to be opened and read into RAM for each invocation.
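A minimal sketch of the per-invocation cost being described, assuming a POSIX-style shell; the library path and file names are placeholders, not from the original post:

```sh
# Each invocation opens the library's metadata.db, reads it into RAM, performs
# one operation, and exits -- so a per-file loop pays that startup cost every time.
for f in *.pdf; do
    calibredb add --with-library=/path/to/library "$f"
done
```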
#22 | Zealot (kjdavies)
It never occurred to me that it would do that (I mean, I think I saw that described in the documentation, but forgot until you mentioned it). And changing it to query as needed would be a significant architecture change, wouldn't it?

Right now the process via calibredb takes about 21 seconds per title, with half a dozen calls per addition. I wonder how difficult it would be to add a couple of new parameters to 'calibredb add', starting with -m/--metadata, where METADATA takes the form field:'value',field:'value',*field:'value',*field:'value' (e.g. identifiers:'isbn:#####,asin:#####'). We can mix built-in and custom columns in other contexts, so I hope we can do the same here. I'd be inclined to add the same -m/--metadata to set_metadata (it doesn't affect those who use it today, because they're using -f). Something like

$ calibredb set_metadata -m *own:1,*source:'OneBookShelf',*filename:'somefilename.pdf',*filepath:'o:/DriveThruRPG/Some Publisher',*filesize:3183845 11345

would let me do all the metadata in one pass... and if the change is made to calibredb add, I can do both steps at once and reduce the entire thing to a single call.

It looks like using an OPF file can reduce the metadata setting to a single call (custom columns look kind of complex in the OPF file). It looks like adding a file won't accept an OPF, though.

Last edited by kjdavies; 06-06-2020 at 01:20 PM. Reason: added 'OPF' comment
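For context, a rough sketch of the multi-call pattern being described above: one add followed by one set_custom call per custom column. The library path is a placeholder, and the column names and book id 11345 are reused from the example in the post rather than taken from a real library.

```sh
# Current pattern: every metadata field costs a separate calibredb invocation.
# In practice the book id would come from the output of 'calibredb add'.
LIB=/path/to/library
calibredb add        --with-library="$LIB" "somefilename.pdf"
calibredb set_custom --with-library="$LIB" own      11345 1
calibredb set_custom --with-library="$LIB" source   11345 "OneBookShelf"
calibredb set_custom --with-library="$LIB" filename 11345 "somefilename.pdf"
calibredb set_custom --with-library="$LIB" filepath 11345 "o:/DriveThruRPG/Some Publisher"
calibredb set_custom --with-library="$LIB" filesize 11345 3183845
```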
#23 | creator of calibre
Simply use the server; then the db is not re-opened on each call, it stays open.
#24 | Zealot (kjdavies)
What I'm doing now takes 20 seconds per entry. On paper the server should be faster, but it takes 2.5 times as long. It does work; it's just not efficient for my purpose. Pretending I sleep 8 hours a night, I can run my load scripts overnight and load about (8 * 60 * 3 =) 1440 files per night, and use calibre normally while I'm awake (and since I'm not using calibre while I'm working, I can sneak in another 8+ hours of load time per day, so 2,880 files/day). Or I can run via the content server all day and load (24 * 60 * 6/5 =) 1728 files per day, and be limited to the content server web interface. The latter might see higher throughput overall if I can run multiple load scripts concurrently, but I haven't proven that doing so does not slow them down.
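As a quick check on the arithmetic above (20 seconds per entry locally is 3 adds per minute; roughly 50 seconds per entry through the server is about 1.2 per minute):

```sh
# Purely illustrative restatement of the estimates in the post above.
echo "local, 8 h overnight:     $((8 * 60 * 3)) files"        # 1440
echo "server, around the clock: $((24 * 60 * 6 / 5)) files"   # 1728
```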
#25 | creator of calibre
No, I mean use calibredb but connect it to the server. See the first couple of paras at https://manual.calibre-ebook.com/gen...calibredb.html
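A minimal sketch of that workflow, assuming a POSIX-style shell, a library at /path/to/library, and 'library-id' as a placeholder for the library's name (a later post in this thread uses 'rpg-auto'); add --username/--password to the calibredb call if the server requires authentication:

```sh
# Start the content server once; it keeps the library open between calls.
calibre-server --port 8080 /path/to/library &

# Point calibredb at the running server instead of at the library folder, so
# each invocation reuses the already-open database rather than re-reading it.
calibredb add --with-library='http://localhost:8080/#library-id' book.epub
```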
#26 | Zealot (kjdavies)
Thank you, Kovid
Before I say anything else, I'd like to thank you for your time, Kovid, and what you've created in calibre. Considering the size of my libraries -- this thread is mostly talking about only one of them -- I'd be lost trying to manage them.
I admit to some consternation regarding how calibredb performs, and that might be coming through in my writing; for that I apologize. In the meantime, I'm suggesting things that would make it work better for my purpose (and, I think, for others who use it as I do... which might be a small number of people). If you point me at the correct place in the code to make these changes, I can see if I can work up a patch. I think the command-line changes shouldn't be that difficult (find the 'metadata argument', parse the metadata elements apart, and apply the update-metadata function multiple times instead of once per calibredb call).
#27 | Zealot (kjdavies)
When "--with-library=n:/libraries/rpg-auto", run time was ~20 seconds per entry. When "--with-library='http://localhost:8080/#rpg-auto'", run time was ~50 seconds per entry. It does work. It just works very slowly. If I can instead do the entire add in one call, or do add and set metadata (built-in and custom) in two calls, I expect I can greatly reduce the time needed. I'm not an efficiency freak, but I think in a use case like mine it could reduce the run time by a factor of four or five, and could reduce a job that I started last weekend to one day (67 hours runtime down to perhaps 15... not quite 'overnight', but two nights or night-and-while-working). Last edited by kjdavies; 06-06-2020 at 02:24 PM. Reason: added 'efficiency' comment |
#28 | Zealot (kjdavies)
Hang on. I have another way.
This almost certainly invalidates my warranty. Adding a new title has lots of baggage (copying files, creating/copying cover images, and so on) that I don't want to duplicate; fair enough, calibredb can take care of that for me. Setting metadata, on the other hand, is just updating a record (if it's built-in, like publisher) or adding a record (custom column). I don't need calibredb to do that; I can handle it externally. I know that each script works with only one database at a time. I can do...

If calibre-server means that each calibredb call is handled (more or less) atomically, I might still be able to have two load processes running against the same library (brokered by the server) safely... but I don't count on that. Which is okay; I can have processes loading against different libraries, and that's enough for now.
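A deliberately cautious sketch of the "handle it externally" idea, using the standard sqlite3 command-line tool against metadata.db. The table and column names here are assumptions about calibre's schema, not anything stated in this thread, and writing to the database while calibre or calibre-server has the library open is exactly the warranty-voiding territory the post acknowledges:

```sh
# Read-only inspection is the safe end of this idea: confirm where the record
# lives before deciding whether to touch anything outside calibre at all.
sqlite3 /path/to/library/metadata.db \
  "SELECT id, title, path FROM books WHERE id = 11345;"

# A single-valued built-in field would be a plain UPDATE on the books table;
# fields stored in their own tables (publisher, tags) and custom columns sit
# behind link tables and need more than one statement. Left commented out on
# purpose, since a schema mistake here can corrupt the library.
# sqlite3 /path/to/library/metadata.db \
#   "UPDATE books SET pubdate = '2020-06-06 00:00:00+00:00' WHERE id = 11345;"
```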
#29 | creator of calibre
First off, I find it extremely hard to believe that calibredb performs worse through the server than without it; I certainly cannot replicate that. Do you perchance have an antivirus/firewall getting in the way?

Secondly, you don't need multiple set_custom calls; you can set all metadata with set_metadata, including custom columns, in a single call. Simply specify --field multiple times.
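A sketch of what that single call might look like, reusing book id 11345 and the custom columns from the earlier example. It assumes custom columns are addressed with a leading # in --field; checking the exact names with calibredb set_metadata --list-fields is a reasonable first step:

```sh
# One invocation sets built-in and custom fields together.
calibredb set_metadata --with-library='http://localhost:8080/#rpg-auto' \
  --field publisher:"Some Publisher" \
  --field "#own:1" \
  --field "#source:OneBookShelf" \
  --field "#filesize:3183845" \
  11345
```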
#30 | Zealot (kjdavies)
And strange, I did try setting multiple metadata fields (standard and custom) at once, using that syntax... it didn't seem to work. However, I will try again; perhaps I had it formatted incorrectly.

set_custom doesn't have '--field', though:

calibredb set_custom [options] column id value

I take it I should try

calibredb set_custom [options] [column id value]*
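One small check that might explain the "didn't seem to work" case, under the assumption that set_metadata (not set_custom) is the command that takes --field and that field names must match what the library reports:

```sh
# List the exact field names set_metadata accepts for this library, including
# custom columns; useful for spotting a name or formatting mismatch.
calibredb set_metadata --with-library='http://localhost:8080/#rpg-auto' --list-fields
```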
Similar Threads

Thread | Thread Starter | Forum | Replies | Last Post
Support multiple instances of same format in same book entry | masp | Library Management | 3 | 09-23-2014 10:44 PM
Multiple identical server instances detected | didierm | Calibre Companion | 2 | 08-17-2014 10:19 AM
Two or multiple instances of Calibre on one computer | clockmaker | Library Management | 2 | 06-30-2012 01:55 PM
Replace multiple matching instances within paragraph? | murphycc | Conversion | 2 | 02-23-2012 09:53 AM
Trouble with multiple content server instances | perx | Calibre | 3 | 02-17-2012 01:24 AM