#1
Junior Member
Posts: 3
Karma: 10
Join Date: Feb 2009
Device: none
I'm trying to download (just once) a page that doesn't like to be downloaded, so I get loads of "too many accesses per second" errors. How can I prevent this? I have two ideas: (1) Slow the download down to something like 1 page per second. How would you do this? (2) Split it up into maybe 50 threads; in that case I would have to set up a "master page" that links all of these in one file. How can you do that? Sorry to post this in an archive thread, but where should I post this?
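For illustration, a minimal Python sketch of idea (1): fetch a list of URLs one at a time with a pause between requests. The URLs, the 1-second delay, and the output filenames are only placeholders to adapt.

```python
import time
import urllib.request

# Placeholder list of pages to fetch -- substitute the real URLs.
urls = [
    "http://example.com/page1.html",
    "http://example.com/page2.html",
]

DELAY_SECONDS = 1.0  # roughly "1 page per second"

for i, url in enumerate(urls):
    # Fetch one page and save it to disk.
    with urllib.request.urlopen(url) as response:
        data = response.read()
    with open(f"page_{i:03d}.html", "wb") as f:
        f.write(data)
    # Wait before the next request so the server never sees a burst.
    time.sleep(DELAY_SECONDS)
```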
#2
Junior Member
Posts: 3
Karma: 10
Join Date: Feb 2009
Device: none
For those of you who have the same problem, here's one solution I found: I made a local copy with WinHTTrack Website Copier (http://www.httrack.com/, GPL), which lets you limit the download speed (bytes/sec, requests/sec, or pause after X bytes). After that I let Sunrise encode from the local copy.