Using wget to download an HTML website

https://apple.stackexchange.com/questions/100570/getting-all-files-from-a-web-page-using-curl

Replace example.com/website with the website you want to download files from.

wget -r -np -k http://example.com/website/

The above command will download all the files it can find in that web directory (i.e. the HTML files). This can be helpful if you're trying to move a simple HTML site.

The -r option means recursive, the -np option (no-parent) keeps wget from climbing up into the parent directory, and the -k option converts the links to local links after it downloads the page.
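If the pages pull in images or stylesheets, a couple of extra options can make the local copy more complete. A minimal sketch using the same placeholder URL; -p downloads page requisites such as images and CSS, and -E saves pages with an .html extension:

wget -r -np -k -p -E http://example.com/website/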

wget multiple links with random access times

Create a file “list.txt” that contains all the URLs you want to download and launch the following command:

for i in $(cat list.txt); do wget ${i} && sleep $(( (RANDOM % 120) + 1 )); done

It’ll now run and, after each link, wait a random amount of time of up to 120 seconds before downloading the next one. Change the number as needed.
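As an alternative sketch, wget can read the URLs straight from the file with -i and handle the pauses itself; --wait sets a base delay in seconds and --random-wait varies each pause randomly around that value:

wget -i list.txt --wait=60 --random-wait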

Download a file from the web using curl

The following command does basically the same thing as wget. This can come in handy since OS X and some Linux distros do not ship with wget by default.

curl -O -L www.incredigeek.com/home/downloads/wget/wget-1.14.tar.gz

The two options do the following:

-O, --remote-name  Write output to a file named as the remote file
-L, --location     Follow redirects
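If you want the download saved under a different name, curl's lowercase -o option writes to whatever filename you give it instead of the remote name. A quick sketch (the output filename here is just an example):

curl -L -o wget-source.tar.gz www.incredigeek.com/home/downloads/wget/wget-1.14.tar.gz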