Bash - download entire website with wget (recursive download)

In this short article, we would like to show how to download an entire website using wget.

Quick solution:

wget -m -np https://some-domain/some/path/to


More complicated configuration

In this section, you can see a smarter configuration that lets you download a website together with its related resources.

wget -m -np -nc -p -k -E -D some-domain --restrict-file-names=windows https://some-domain/some/path/to


-m or --mirror

is equivalent to -r -N -l inf -nr


  • -r or --recursive - downloads the entire website recursively
  • -N or --timestamping - timestamps downloaded files so they are not downloaded again during an update
  • -l inf - sets unlimited recursion depth
  • -nr or --no-remove-listing - prevents removing the temporary .listing files generated by FTP retrievals
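
Written out in full, the -m shortcut expands to the following equivalent command (same placeholder URL as above):

```shell
# same as: wget -m https://some-domain/some/path/to
wget -r -N -l inf --no-remove-listing https://some-domain/some/path/to
```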

-np or --no-parent

prevents wget from ascending to the parent directory, so only links under /some/path/to are followed

-nc or --no-clobber

prevents overwriting existing files (useful when resuming an interrupted download)
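
Because -nc skips files that already exist, an interrupted mirror can simply be restarted by running the same command again; a sketch with the placeholder URL used in this article:

```shell
# first run was interrupted - run the same command again:
# files that were already downloaded are skipped, not overwritten
wget -m -np -nc https://some-domain/some/path/to
```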

-p or --page-requisites

downloads all elements that compose the page (images, css, js, etc.)

-k or --convert-links

converts links so the downloaded website works locally/offline

-E or --html-extension

saves downloaded files with the .html extension (in newer wget versions this option is called --adjust-extension)
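
For a single page rather than a whole site, -p, -k, and -E are often combined on their own; a sketch with a placeholder URL:

```shell
# download one page with its images/CSS/JS, rewrite links for offline
# browsing, and save HTML documents with an .html extension
wget -p -k -E https://some-domain/some/page
```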


-D or --domains

limits the download to a comma-separated list of domains, preventing wget from following links outside them
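
Note that -D only takes effect together with -H (host spanning); a sketch with placeholder domains:

```shell
# follow links across hosts, but only within the listed domains
wget -m -H -D some-domain,cdn.some-domain https://some-domain/some/path/to
```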


--restrict-file-names=windows

modifies filenames to make them work on Windows

https://some-domain/some/path/to

the address of the website to download
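
In windows mode, characters that are not allowed in Windows filenames (such as ?, *, or :) are escaped when files are saved; for example, a query URL is saved with @ in place of ? (a sketch with a placeholder URL):

```shell
# a resource like /page?id=1 is saved as page@id=1 instead of page?id=1
wget --restrict-file-names=windows https://some-domain/page?id=1
```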


Or just:

wget --mirror \
     --no-parent \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --html-extension \
     --domains some-domain \
     --restrict-file-names=windows \
     https://some-domain/some/path/to


