Quote:
Originally Posted by edeniz
Is it normal that the "download_metadata" part of the download process alone takes up to 5+ minutes? I mean, I can probably ascribe that to there being far too many "exclude_metadata_pre" lines, right? Or should I look for faulty/inefficient lines?
How many stories you're downloading or updating, and which sites they're on, are usually the biggest factors.
The second biggest factor in my experience is the number of story URLs in your Reject List.
Both of those will be further slowed by large numbers of replace_metadata or in/exclude_metadata lines.
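Purely for illustration, in case anyone following along hasn't used them, lines like these in personal.ini are the kind I mean (the keys and values here are made up for the example, not a recommendation):

Code:
## exclude rules: =~ is a regexp match, == is an exact match
exclude_metadata_pre:
 category=~Buffy
 genre==Crossover

## replace rule: metadata_key=>regexp=>replacement
replace_metadata:
 genre=>Sci-Fi=>Science Fiction

Every story's metadata gets run through each entry like that, which is why having hundreds of them adds up.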
I have some new changes I'm currently testing that speed up the Reject List checking considerably and add some new ini functionality. I plan to post a new version for others to test early this week.