I recently got a request to download a large number of files (over 200) from a website, and rather than doing it manually I decided to find out whether it could be automated. In a previous job I had learned about the wget utility and how it can fetch files remotely, so I installed wget on my computer (via Cygwin), did a bit of reading online, and used the following command to download the files I wanted:
wget -m -p -E -k -K -np http://site_url/path
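For anyone curious what those short flags actually do, here is the same command annotated with each option's long-form equivalent (the URL is still the placeholder from above, so substitute the real site before running):

```shell
# -m   --mirror           : recursive download with timestamping, infinite depth
# -p   --page-requisites  : also fetch images, CSS, etc. needed to render each page
# -E   --adjust-extension : save pages with a proper .html extension
# -k   --convert-links    : rewrite links in downloaded files for offline browsing
# -K   --backup-converted : keep a .orig backup of each file before converting links
# -np  --no-parent        : never ascend above the starting directory on the server
wget -m -p -E -k -K -np http://site_url/path
```

The `--no-parent` flag is the one that keeps a mirror job polite: it stops wget from crawling upward into the rest of the site and downloading far more than you asked for.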
And with that simple command, I let wget do its work; after a few minutes all the files were downloaded and ready to use!