Quote:
Originally Posted by BetterRed
Just for the giggle, I used the QC PI to swap last & first names on 137 authors, 181 books; 36 books had videos of 3-5GB. It was done in a second or two as I watched the author folders change before my very own eyes in my file manager. Given all the files were renamed, I conclude that the files are moved using hardlinks, either by calibre itself or the file system - if they had been copied it would have taken about 4m40s.
I conclude that there is nothing wrong with the QC PI; it's working the same today as it was last week, last month... That one user in, say, 10% of 3.5M (Usage Statistics) has a performance problem suggests to me that the problem is localised to that user's environment.
Hoods7070, if the drive that your library is sitting on is a FAT32 drive, then swapping names will take longer because the actual files are copied and renamed. On an NTFS drive it's done via a hardlink create/rename and hardlink delete, which all happens in the MFT - preallocated, contiguous space - and hardlinks are very small, 128 bytes rings a bell; MFT entries are not sector bound.
BR
Well, I agree. From all the responses to this thread it's clear the issue is at my end. That's why I've been asking for, and taking note of, most suggestions. I still believe Calibre has a part to play, even though there's nothing to be done about that.
Probable culprit No. 1: bloated database
I'm sure theducks has it right about my database changing size.
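For anyone who wants to see whether their metadata.db really has bloated, here's a rough sketch using Python's built-in sqlite3 module. The library path is just an example - substitute your own, close Calibre first, and run anything destructive on a backup copy, not the live library:
Code:
import os
import sqlite3

# Hypothetical path - point this at your own library's metadata.db (close Calibre first)
DB = r"D:\CalibreLibrary\metadata.db"

print(f"File size on disk: {os.path.getsize(DB) / 1024 / 1024:.1f} MB")

con = sqlite3.connect(DB)
page_size = con.execute("PRAGMA page_size").fetchone()[0]
page_count = con.execute("PRAGMA page_count").fetchone()[0]
freelist = con.execute("PRAGMA freelist_count").fetchone()[0]
print(f"Pages in use: {page_count - freelist}, free (dead) pages: {freelist}")
print(f"Space a VACUUM could reclaim: {freelist * page_size / 1024 / 1024:.1f} MB")

# Uncomment to compact the file - only ever on a backup copy
# con.execute("VACUUM")
con.close()
If the free-page count is tiny, the size is all real data and splitting the library is the only way to shrink it; if it's large, the file is carrying a lot of dead space.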
Just for a giggle, to quote Red, I ran the QC FN/LN swap on my daughter's library of 202 books, 15 authors, sitting on a USB3 FAT32 portable drive:
- 45 seconds to apply the swap to the whole library on my husband's always-slow i3 (USB2) laptop
- 30 seconds on my (USB3) laptop
- 30 seconds on my PC. Interestingly, though, only 20 seconds after I removed my main library from Calibre. Perfectly acceptable, but not as fast as I remember it being.
Anyway, that pretty much proves the point: a) it's not the number of books you're changing that counts but the size of the library they sit in, b) Calibre clearly has a limit somewhere, and c) the speed is affected by what other libraries are registered in Calibre.
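For anyone who wants to see the rename-versus-copy difference BetterRed describes for themselves, here's a rough timing sketch (Python, made-up path and file size - adjust to taste). A rename on the same volume only touches directory metadata, while a copy has to push every byte through again, so whatever the exact reason the FAT32 stick is slower, the scale of the difference is easy to demonstrate:
Code:
import os
import shutil
import time

# Hypothetical scratch file on the drive you want to test
SRC = r"D:\temp\rename_test.bin"
with open(SRC, "wb") as f:
    f.write(os.urandom(200 * 1024 * 1024))  # ~200 MB of throwaway data

# 1) Rename on the same volume: only the directory entry changes
t0 = time.perf_counter()
os.rename(SRC, SRC + ".renamed")
print(f"rename: {time.perf_counter() - t0:.3f} s")

# 2) Copy: every byte is read and written again
t0 = time.perf_counter()
shutil.copy2(SRC + ".renamed", SRC + ".copied")
print(f"copy:   {time.perf_counter() - t0:.3f} s")

# Tidy up
os.remove(SRC + ".renamed")
os.remove(SRC + ".copied")
The rename should come back in milliseconds no matter how big the file is, while the copy time grows with file size - which is why those 3-5GB video folders would have been the killer if anything were actually copying them.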
Of interest, by comparison: my accounting software (Quicken), which I have also migrated to the laptop and which holds 12 years' worth of very detailed data with umpteen "tag" equivalents, is unaffected by being on the laptop. So I assumed Calibre would handle a big database just as easily.
Possible culprit No. 2: AV
I have changed nothing on the PC, but my AV HAS been updated, and the new version is also on my laptop. I am going to call them and ask how to correctly whitelist Calibre. I think I already HAVE, but I will double-check.
Unlikely culprits:
The laptop HDD is ATA with NTFS; my PC drives are standard SATA NTFS 7200 rpm drives. The only difference between my setup and perhaps the majority of other systems is that I have all my data, including Windows system data folders, on a separate partition (laptop) or drive (PC), but that really shouldn't make any difference, as my drives are all set to a 0 response time.
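If anyone else wants to double-check what filesystem their library drive actually uses (relevant given BR's FAT32 point), a quick sketch with the third-party psutil module (pip install psutil):
Code:
import psutil

# Print every mounted volume with its filesystem type (NTFS, FAT32, exFAT, ...)
for part in psutil.disk_partitions():
    print(f"{part.device:<12} {part.mountpoint:<12} {part.fstype}")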
Anyway, I won't belabour this any further - it is what it is. I will divide my library up by country and likely reads, and see if that makes the laptop better. I hope so, as that's what I bought it for. Ironically, I HAD already split my library that way, but changed my mind and put it all back into one, and that's how I lost my tags ...
Thanks for all help offered - gratefully received.