bulk operations - a couple of questions
Hi there! I'm brand new here, so hopefully I'm posting this in the right place.
I'm using Calibre (huge love!) to manage a very large collection (100,000 books, about 100 GB) and I've run into a couple of problem areas.
First, with bulk cover and metadata downloads, I hit a Python error when I try to update 4,000 books at once. If I break it up into batches of, say, 1,000 at a time, it seems OK. Is there any way to overcome this limitation? I have been processing in batches like this, but it would be far less labor intensive to do them all at once.
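For context, this is roughly how I've been splitting the job (a minimal sketch only; the ID list and batch size are placeholders, since the actual download runs in the Calibre GUI):

```python
def chunks(items, size):
    """Yield successive slices of `items` with at most `size` elements each."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Stand-in for the 4,000 selected book IDs; in practice I select
# each batch in the GUI and run the metadata download on it.
book_ids = list(range(4000))

batches = list(chunks(book_ids, 1000))
print(len(batches))      # 4 batches
print(len(batches[0]))   # 1000 IDs in each full batch
```

So it's four separate download runs instead of one, which works but takes a lot of babysitting.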
The second issue is that copying from one library to another slows down dramatically as the job progresses. The first few books copy very quickly, but each subsequent book takes longer and longer to process, and copying 4,000 books takes many hours. Is that normal, and is there any way to speed it up?