Installing Basic Linux tools on AlmaLinux 9 (tar, wget, htop)

The local team wizard, Mark, ran into some issues while trying to set up a system with AlmaLinux 9. Tar wasn’t installed! What?! No worries. We can solve this by just installing tar with dnf. While we are at it, let’s install some other helpful utilities.

sudo dnf install -y tar wget htop

Tada! We are back in business.
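
If you want to double check that everything landed on the PATH, a quick sanity check (assuming the packages installed cleanly) could look like this:

tar --version | head -n 1
wget --version | head -n 1
htop --version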

Limit Network Speed of wget

You can limit the download speed for wget with the --limit-rate option.

Example command

wget --limit-rate=128K incredigeek.com/file-to-download.html

Replace 128K with the rate you would like. The rate is in bytes per second; add a k suffix for kilobytes or an m suffix for megabytes.

More info from the man pages

  --limit-rate=amount
      Limit the download speed to amount bytes per second. Amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=20k will limit the retrieval rate to 20KB/s. This is useful when, for whatever reason, you don't want Wget to consume the entire available bandwidth.

      This option allows the use of decimal numbers, usually in conjunction with power suffixes; for example, --limit-rate=2.5k is a legal value.

      Note that Wget implements the limiting by sleeping the appropriate amount of time after a network read that took less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to approximately the specified rate. However, it may take some time for this balance to be achieved, so don't be surprised if limiting the rate doesn't work well with very small files.

wget --limit-rate options
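
As the man page notes, decimal rates with suffixes are accepted. For example, to cap a download at roughly one megabyte per second (the URL is just the placeholder from earlier), something like this should work:

wget --limit-rate=1m incredigeek.com/file-to-download.html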

Using wget to download an HTML website

https://apple.stackexchange.com/questions/100570/getting-all-files-from-a-web-page-using-curl

Replace example.com/website with the website you want to download files from.

wget -r -np -k http://example.com/website/

The above command will download all the files it can find in that web directory (i.e. the HTML files). This can be helpful if you’re trying to move a simple HTML site.

The -r option means recursive, -np (no-parent) keeps wget from climbing into parent directories, and the -k option converts the links to local links after it downloads the page.
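
If the site also pulls in images and stylesheets, one possible variation is to add -p (page requisites) and -E (save pages with an .html extension). Treat this as a sketch rather than a one-size-fits-all mirror command:

wget -r -np -k -p -E http://example.com/website/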

wget multiple links with random wait times

Create a file “list.txt” that contains all the URLs you want to download, one per line, and launch the following command

for i in $(cat list.txt) ; do wget "${i}" && sleep $(( ( RANDOM % 120 ) + 1 )) ; done

It’ll now run and, after each link, wait a random amount of time (up to 120 seconds) before downloading the next one. Change the number as needed.
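
wget can also handle the waiting on its own. Assuming a reasonably recent version, -i reads the URLs from a file, and --wait combined with --random-wait varies the delay between requests (roughly 0.5 to 1.5 times the wait value):

wget -i list.txt --wait=60 --random-wait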

Download a file from the web using curl

The following command basically does the same thing as wget. This can come in handy since OS X and some Linux distros do not ship with wget by default.

curl -O -L www.incredigeek.com/home/downloads/wget/wget-1.14.tar.gz

The two options do the following

-O, --remote-name    Write output to a file named as the remote file
-L, --location       Follow redirects
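
A couple of related curl options can come in handy as well: -o lets you pick the local file name instead of using the remote one, and -C - resumes a partial download. The file name here is just an arbitrary example:

curl -L -o wget-source.tar.gz www.incredigeek.com/home/downloads/wget/wget-1.14.tar.gz
curl -O -L -C - www.incredigeek.com/home/downloads/wget/wget-1.14.tar.gz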

How to Install wget on OS X

The wget tool is an extremely useful command that allows you to download files from websites from the command line. Before you begin, you should make sure you have Xcode installed. If you don’t, just download it from the App Store; it’s free.
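
If you are not sure whether the command line compiler tools are present, they can usually be installed from the terminal with the following (on most OS X versions this pops up a small installer prompt):

xcode-select --install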

To install wget on your Mac, you first need to download the source code (the wget-1.14 tarball, for example with the curl command above). Next you need to extract the tarball. You can do this by double clicking the file in Finder, or you can run the following command in the terminal

tar -zxvf ~/Downloads/wget-1.14.tar.gz

Run the rest of these commands in the Terminal app.

Next we will cd into the directory.

cd ~/Downloads/wget-1.14

Then,

./configure --with-ssl=openssl

Make it

make

and finally install it.

sudo make install

And of course make sure it works.

wget --help
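
To confirm the freshly built copy is the one being picked up (the default install prefix is /usr/local, so that is where it should land), you can also check:

which wget
wget --version | head -n 1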

Congratulations, you have successfully compiled wget from source.