08-22-2014, 02:09 PM   #7
ms233
Quote:
Originally Posted by kovidgoyal
Because every copy has to check for duplicates against all existing books. As you copy more books the list of books to check against becomes longer.
Hi Kovid!

From my observation, each copy job starts fast and slows down as it progresses, and when I start a second job, that job also starts fast and then slows down. If duplicate checking were the cause, I'd expect each job to be progressively slower than the one before it, which doesn't seem to be the case. In every job, the last 5% takes roughly 10x longer than the first 5%. This is while copying a library of about 4,000 titles into one that already holds about 100,000. Hopefully that makes sense.
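
To put numbers on that doubt, here's a back-of-the-envelope model (my own guess at how the dupe check scales, not calibre's actual code): if each copy costs roughly one comparison per book already in the destination, then the last copy of a 4,000-book job into a 100,000-book library is only about 4% more expensive than the first, nowhere near the 10x slowdown I'm seeing:

[CODE]
D = 100000    # books already in the destination library
JOB = 4000    # books copied in one job

first_cost = D + 1      # comparisons for the first copy in the job
last_cost = D + JOB     # comparisons for the last copy in the job
print("last/first cost ratio: %.3f" % (float(last_cost) / first_cost))
# prints: last/first cost ratio: 1.040
[/CODE]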

Any insight on the Python error when trying to download metadata for 4000+ titles? I'm guessing I'm hitting a maximum within the software for the size of the list of books to update.
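
If it does turn out to be a size limit, the workaround I'd try is splitting the selection into smaller chunks and downloading metadata one chunk at a time. A rough sketch (the 500-book batch size is an arbitrary guess on my part, not a documented calibre limit):

[CODE]
def batches(items, size=500):
    # Yield successive slices of `items`, each at most `size` long.
    for start in range(0, len(items), size):
        yield items[start:start + size]

book_ids = list(range(4200))   # stand-in for my actual 4000+ titles
for chunk in batches(book_ids):
    # In practice: select just this chunk in the GUI and run the
    # metadata download on it before moving on to the next chunk.
    print("would download metadata for %d books" % len(chunk))
[/CODE]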