Quote:
Originally Posted by mbovenka
No, it's not *that* bad. With Calibre itself and the source on an SSD and the library on a dedicated 5400 RPM drive, I get about 25 adds/min (older mobile Core i7).
Library size also plays a role, as Calibre does basic dup checking when adding.
|
I meant when using only an HDD (particularly if a single drive serves as both source and destination). I reckon having the source on a different drive from the target already helps a lot.
Mind, on a single "slow" SSD (Samsung 840 Basic) and an i5-3450S, I get about 1 book/second, and that's probably only because I call
calibredb add --library-path once per book, which I expect is far less efficient than a normal batch import. There's no auto-merge/duplicate check during import; I just run the Find Duplicates plugin afterwards.
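For anyone curious what the difference looks like, here's a rough sketch (paths are placeholders, and I haven't benchmarked these exact lines):

```shell
LIB=/path/to/library    # hypothetical library path
SRC=/path/to/import     # hypothetical source directory

# Per-book import (what I do): one calibredb process per file.
# Every call pays interpreter startup plus a library open/commit,
# which is why it's noticeably slower than a batch run.
for f in "$SRC"/*.epub; do
    calibredb add --library-path "$LIB" "$f"
done

# Batch import: a single invocation handles all the files,
# amortizing that per-process overhead across the whole set.
calibredb add --library-path "$LIB" "$SRC"/*.epub
```

The per-book loop is only worth it if you need per-file control (logging, error handling, custom metadata per book); otherwise the batch form should win.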
Quote:
Originally Posted by theducks
I think some of my speed issue is this 'refurbished Core Duo PC' came with 8G of slow end RAM (according to UserBenchmark), DDR3-1100 instead of 1600 (listed by Crucial). I put in a WD Black drive thinking the drive was the slow part, but saw no obvious improvement.
|
A WD "Black" HDD isn't notably faster than a "Green" one for this workload; an import is dominated by lots of small writes and database commits, not the sequential throughput the faster drive class is sold on.
Quote:
Originally Posted by Tanjamuse
I think the biggest library is 60.000 ...
|
Yeah, just leave it running overnight. Maybe run the imports in batches of 5-10K?
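If you want to script the batching rather than babysit it, something like this should work (untested sketch, paths hypothetical; adjust the extension glob and batch size to taste):

```shell
LIB=/path/to/library    # hypothetical library path

# Feed files to calibredb in batches of 5000 per invocation.
# -print0 / -0 keep filenames with spaces intact; -n caps the
# number of files passed to each calibredb add call.
find /path/to/books -name '*.epub' -print0 |
    xargs -0 -n 5000 calibredb add --library-path "$LIB"
```

That way each batch commits on its own, so if something dies overnight you only re-run the remainder instead of starting the whole 60K over.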