Actually, I tried it out on one site in particular that hosts a TON of public-domain literature... so I could read it offline rather than pay the $18.95 for the DVD.
But it didn't work, not completely. The download took four hours and mostly went as it was supposed to, but there were gaps: none of Aldous Huxley's writings came through, and heaven knows how much else is missing.
I did check, and the website admins have tools in place to combat this kind of thing, since it puts a heavy load on their servers.
Oh well.
Sorry you had this experience on your first try. Gosh, going only from memory, I think there is a setting that lets you pause or slow the connection so that it doesn't trigger the blocking mechanism on the website you're trying to download from.
I use it to back up my websites, and I get an email if my bandwidth is exceeded for the time period I specify. I have the option to disconnect the user automatically or to use preset limits.
All that said, I set it up about a year ago and haven't had to look at the settings since, so my memory isn't clear on the details.
If you can't find those settings, try downloading only one or two link levels at a time, then repeat the run for the levels you left out. I'm pretty sure the links will still be intact when you finish.
Do you only use Windows? There's free, open-source software from the Linux community that does the same thing. Some of it, like wget (Google "wget for Windows"), has been ported to Windows and has a myriad of switches for accommodating websites that limit your downloads, like FReerepublic, for instance. :)
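Going from memory again, and just as a sketch (the URL below is a placeholder, and the numbers are ones you'd tune to taste, not magic values), a wget run along these lines slows itself down and limits how deep it crawls, which is what keeps most rate limiters off your back:

# --level=2         : follow links only two levels deep
# --wait=5          : pause 5 seconds between requests
# --random-wait     : vary that pause so the traffic looks less robotic
# --limit-rate=100k : cap download speed at about 100 KB/s
# --no-parent       : stay below the starting directory
# --page-requisites : also grab the images/CSS each page needs
# --convert-links   : rewrite links so the pages work offline
wget --recursive --level=2 \
     --wait=5 --random-wait --limit-rate=100k \
     --no-parent --page-requisites --convert-links \
     http://example.com/books/

That --level switch is the same trick as downloading a couple of levels at a time: start shallow, then raise the number and rerun for anything the first pass missed.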
Best of luck, FRiend!