I'm a student and need to pull down lots of stuff from my professor’s website, preferably retaining some of the folder structure.
I'm working on Windows boxes and have access to Windows XP, Windows 7, and Windows Server 2008 R2. Way back in the day (2-3 years ago) I tried some utilities that mirrored web pages and that sort of thing, but for varying reasons they either never worked right or I could never get what I wanted out of them.
So, for example, these folders:
In essence, a real pain to try to download manually for later use.
I've tried this utility, and it's either overkill or not-simple-enough-kill, because I could never get it to just download files to my hard drive.
Ideally, I'd like to recursively scan the folder, recreate the folder structure in some specified folder, then copy the files from the remote server to their corresponding folder on my local machine.
23 Answers
The simplest utility for downloading files from a web site recursively is wget.
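A minimal sketch, assuming a Windows build of wget is on your PATH and the course material lives under a placeholder URL like http://www.example.edu/~prof/course/ (substitute the real address and whatever local target folder you want):

    wget -r -np -nH --cut-dirs=2 -R "index.html*" -P C:\course http://www.example.edu/~prof/course/

Here -r recurses through the folder, -np (--no-parent) keeps it from climbing above the starting directory, -nH and --cut-dirs=2 drop the hostname and the two leading path components (~prof/course) so the local tree starts at the files themselves, -R "index.html*" skips the auto-generated directory listing pages, and -P sets the folder the mirror is written into.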
Firefox addon: DownThemAll!
Chrome extension: GetThemAll
Look at using HTTrack:
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
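If you'd rather script it than click through the WinHTTrack GUI, HTTrack also provides a command-line executable. A minimal sketch, with the URL, output folder, and filter as placeholders:

    httrack "http://www.example.edu/~prof/course/" -O "C:\course-mirror" "+www.example.edu/~prof/course/*" -v

-O sets the local directory the mirror is built in, the "+..." filter restricts the crawl to the course folder so it doesn't wander across the rest of the site, and -v prints verbose progress.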