#16
Enthusiast
Posts: 29
Karma: 324
Join Date: Mar 2008
Device: ebookwise, n800, tablet, etc
Good point. I'd be OK with only one metadata download job running at a time (i.e. never run more than one at once), but getting results back in pieces (e.g. 100 at a time). I'm really curious whether it's actually faster to run five jobs of 100 books each or one job of 500. (The larger the job, the slower it gets, I find...)
#17
creator of calibre
Posts: 45,419
Karma: 27757236
Join Date: Oct 2006
Location: Mumbai, India
Device: Various
As I said, since downloads happen in short-lived worker processes, there is absolutely no way the speed of downloading can be affected by the size of the parent job. You will get faster overall results by splitting into multiple jobs because of the parallel downloads, but that has the effect of hammering the servers, so it is not something I recommend.

And given that bulk downloading metadata for large numbers of books is not something that needs to happen frequently, I suggest you just queue up your job overnight and stop worrying about it.
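For anyone curious about the trade-off being described, here is a minimal Python sketch of the idea: one large sequential job versus several smaller batches run in parallel worker processes. This is not calibre's actual API; `fetch_metadata`, `run_single_job` and `run_parallel_jobs` are hypothetical stand-ins, with a sleep simulating network I/O. The parallel version finishes sooner, but all of its workers hit the metadata servers at the same time, which is the "hammering" referred to above.

```python
# Hypothetical sketch, not calibre's real download code.
# fetch_metadata() is a stand-in for a single per-book metadata download.
import time
from concurrent.futures import ProcessPoolExecutor


def fetch_metadata(book_id):
    """Stand-in for one metadata download; sleeps to simulate network I/O."""
    time.sleep(0.01)
    return {"id": book_id, "title": f"Book {book_id}"}


def run_single_job(book_ids):
    """One large job: books are fetched one after another."""
    return [fetch_metadata(b) for b in book_ids]


def run_parallel_jobs(book_ids, batch_size=100, max_workers=5):
    """Several smaller batches run concurrently in worker processes.
    Finishes sooner overall, but every worker queries the servers at once."""
    batches = [book_ids[i:i + batch_size]
               for i in range(0, len(book_ids), batch_size)]
    results = []
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        for batch_result in pool.map(run_single_job, batches):
            results.extend(batch_result)
    return results


if __name__ == "__main__":
    ids = list(range(500))

    t0 = time.time()
    run_single_job(ids)
    print(f"one job of 500:        {time.time() - t0:.1f}s")

    t0 = time.time()
    run_parallel_jobs(ids)
    print(f"five batches of 100:   {time.time() - t0:.1f}s")
```

With the simulated delay, the single 500-book job takes roughly five times as long as the five parallel batches, which matches the point above: per-book download speed does not depend on job size, and any wall-clock gain from splitting comes purely from parallelism at the servers' expense.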
| Thread | Thread Starter | Forum | Replies | Last Post |
|---|---|---|---|---|
| Bulk Metadata Download Problem | sweetevilbunies | Library Management | 6 | 07-04-2011 10:39 PM |
| Bulk metadata download incoherent | madeinlisboa | Calibre | 6 | 06-24-2011 01:18 PM |
| Split HTML Size to Speed-Up Page Turns | ade_mcc | Conversion | 2 | 02-01-2011 06:06 AM |
| metadata in bulk | Lorraine Froggy | Calibre | 1 | 11-14-2009 09:42 PM |
| Bulk Metadata Download | iain_benson | Calibre | 1 | 09-29-2009 11:42 AM |