
wget

The non-interactive network downloader

Description

The wget command downloads files from URLs using the HTTP, HTTPS, and FTP protocols. It is designed for robustness over slow or unstable network connections: if a download fails because of a network problem, wget keeps retrying until the whole file has been retrieved. If the server supports resumption ("regetting"), wget instructs it to continue the download from where it left off.

wget is non-interactive, meaning it can work in the background, after a user has logged off. This allows you to start a retrieval and disconnect from the system, letting wget finish the work.

Key features:

  1. Resuming interrupted downloads: Supports REST in FTP and Range in HTTP.
  2. Support for HTTP/HTTPS and FTP: Handles both protocols for broad compatibility.
  3. Proxy support: Can operate through HTTP proxies.
  4. Simplicity: Easy to use from the command line.
  5. Free and Lightweight: Open-source and has a small footprint.

Syntax

wget [options] [URL...]

Options

wget has a large number of options, grouped in the manual into the following categories (see wget --help or man wget for the complete list):

  1. Startup options
  2. Logging and input file options
  3. Download options
  4. Directory options
  5. HTTP options
  6. FTP options
  7. Recursive download options

Parameters

URL: The address of the file or directory to download. Multiple URLs may be given on one command line, separated by spaces.

Examples

Download a single file:

wget http://www.example.com/testfile.zip

Download and save with a different name (quote the URL so the shell does not interpret characters such as ? and &):

wget -O wordpress.zip "http://www.example.com/download.aspx?id=1080"

Limit download speed:

wget --limit-rate=300k http://www.example.com/testfile.zip

Resume an interrupted download:

wget -c http://www.example.com/testfile.zip

Download in the background:

wget -b http://www.example.com/testfile.zip
# Check progress with:
tail -f wget-log

Masquerade User-Agent:

wget --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3" http://www.example.com/testfile.zip

Test whether a link is valid without downloading the file (spider mode); wget's exit status is 0 if the URL is reachable:

wget --spider URL

Download multiple files from a list:

wget -i filelist.txt
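The input file is plain text with one URL per line. A minimal sketch of creating such a list (the URLs are placeholders):

```shell
# Build a download list: a plain-text file with one URL per line
# (placeholder URLs).
cat > filelist.txt <<'EOF'
http://www.example.com/file1.zip
http://www.example.com/file2.zip
EOF
```

With the list in place, wget -i filelist.txt fetches every URL in it; adding -c makes the whole batch resumable.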

Mirror a whole website for local viewing (--mirror turns on recursion and timestamping, -p downloads page requisites such as images and stylesheets, --convert-links rewrites links to point at the local copies, and -P sets the target directory):

wget --mirror -p --convert-links -P ./LOCAL_DIR URL

Download only specific file types during a recursive retrieval (e.g., PDFs; -A accepts a comma-separated list of suffixes or patterns):

wget -r -A.pdf URL

FTP download with authentication:

wget --ftp-user=USERNAME --ftp-password=PASSWORD ftp://example.com/file.zip
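Note that a password given on the command line is visible in the shell history and the process list. As an alternative, wget can read credentials from a ~/.netrc file; a sketch of the file format (USERNAME and PASSWORD are placeholders):

```
# ~/.netrc -- keep it readable only by you: chmod 600 ~/.netrc
machine example.com
  login USERNAME
  password PASSWORD
```

With the file in place, the credentials can be dropped from the command line: wget ftp://example.com/file.zip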