08-10-2010, 08:42 PM   #1
blither (Junior Member)
Join Date: Aug 2010
Device: PRS-505

Scalability and performance

I've been using Calibre for over a year with my Sony PRS-505. Actually, I read about Calibre and that's what helped me pick the Sony...

Anyway, my ebook collection is pretty large: over 20,000 books. I have most of the Gutenberg titles, lots of PDFs and RTF/DOC files from work, and LIT/ePUB books that I have purchased. I have been converting as much as possible to ePUB, which has gone pretty well. I've got Calibre running on both Windows and Linux (libraries synced with Dropbox).

My question is about performance and scalability tweaks in Calibre.

Searching and filtering are pretty good - if your tags are up to date. Even with a lot of tags on a lot of books, Calibre does a good job of sorting, which matters with a big library. But when I select a bunch of books to edit their tags, the process takes a LONG time. Right now my system is still churning as it tries to add one new tag to 300 books I just selected. It's been chugging along for about 10 minutes and will probably go for another 10.
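For what it's worth, the string manipulation behind a bulk tag add is trivial; the time presumably goes into writing each book's metadata back to the database one record at a time. Here is a minimal sketch of what the per-book update amounts to (the function names and the book-record shape are hypothetical illustrations, not Calibre's actual API):

```python
def add_tag(tags, new_tag):
    """Return the tag list with new_tag appended, deduplicated case-insensitively."""
    if new_tag.lower() in (t.lower() for t in tags):
        return list(tags)
    return list(tags) + [new_tag]

def bulk_add_tag(books, new_tag):
    """Apply add_tag to every book in the selection.

    Each book is a dict like {'id': ..., 'tags': [...]} (an assumed shape
    for illustration only).
    """
    for book in books:
        book['tags'] = add_tag(book['tags'], new_tag)
    return books
```

Since the in-memory work above is essentially free even for 300 books, the slowness must be in the per-book commit, which is why I'm asking about tweaks rather than assuming my hardware is the bottleneck.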

I'm not complaining, because Calibre *works*! It may take a while, but it gets through the job successfully - eventually. My concern is about large jobs in general. I have tried things like exporting the entire library and then importing it again. The export went OK, but I can't import more than about 8,000 books without Calibre crashing: it gets to one title or another, hangs for a while, and eventually keels over.
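One workaround I'm considering, and would welcome opinions on: instead of importing all 20,000+ books in a single job, import in fixed-size batches so that one bad title can't take down the whole run. A minimal sketch of the batching idea (here `import_one` is a hypothetical stand-in for whatever actually adds a book, not a real Calibre call):

```python
def chunked(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def import_in_batches(paths, import_one, batch_size=500):
    """Import paths in batches, recording failures instead of aborting."""
    imported, failed = 0, []
    for batch in chunked(paths, batch_size):
        for path in batch:
            try:
                import_one(path)
                imported += 1
            except Exception:
                failed.append(path)  # skip the offending title, keep going
    return imported, failed
```

The point of the try/except around each title is that a hang-prone book ends up in the `failed` list to investigate later, rather than killing an import that is 8,000 books deep.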

My library might not be typical, so any suggestions for optimizing performance with a large library would be appreciated...