Bash - download entire website with wget (recursive download)
In this short article, we would like to show how to download an entire website using wget.
Quick solution:
wget -m -np https://some-domain.com/some/path/to
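For reference, -m is a mirroring shorthand: according to the GNU wget manual it currently expands to -r -N -l inf --no-remove-listing, so the quick solution is equivalent to the longer form below (echoed here as a dry run; remove the leading `echo` to actually download):

```shell
# -m expands (per the wget manual) to: -r -N -l inf --no-remove-listing,
# i.e. recursion with infinite depth plus timestamping.
# Echoed as a dry run; drop `echo` to run the download for real.
echo wget --recursive --timestamping --level=inf --no-remove-listing \
    --no-parent \
    https://some-domain.com/some/path/to
```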
More advanced configuration
In this section, you can see a smarter configuration that downloads a website together with its related resources.
wget -m -np -nc -p -k -E -D some-domain.com --restrict-file-names=windows https://some-domain.com/some/path/to
Where:
| -m (--mirror)                        | turns on mirroring (recursion with infinite depth and timestamping) |
| -np (--no-parent)                    | prevents following links outside the directory /some/path/to |
| -nc (--no-clobber)                   | prevents overwriting existing files (useful when interrupted or resuming) |
| -p (--page-requisites)               | downloads all elements that compose the page (images, CSS, JS, etc.) |
| -k (--convert-links)                 | converts links to make the downloaded website work locally / offline |
| -E (--html-extension)                | saves downloaded files with the .html extension |
| -D some-domain.com                   | prevents following links outside the domain some-domain.com |
| --restrict-file-names=windows        | modifies filenames to make them work on Windows |
| https://some-domain.com/some/path/to | the downloaded website URL |
Or just, using the long option names (note: in newer wget versions --html-extension is a deprecated alias of --adjust-extension):
wget --mirror \
--no-parent \
--no-clobber \
--page-requisites \
--convert-links \
--html-extension \
--domains some-domain.com \
--restrict-file-names=windows \
https://some-domain.com/some/path/to
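The long-form command above can be wrapped in a small reusable helper. This is a minimal sketch (the function name and the sed-based domain extraction are our own assumptions, not part of wget): it derives the bare domain from the URL for the --domains filter and prints the resulting command so it can be reviewed first.

```shell
# Minimal sketch (names are assumptions): build the wget mirror command for a
# given URL. The command is echoed for review; remove the `echo` inside the
# function to actually run the download.
mirror_site() {
    url="$1"
    # Derive the bare domain (e.g. some-domain.com) for the --domains filter.
    domain=$(printf '%s\n' "$url" | sed -e 's|^[a-z]*://||' -e 's|/.*$||')
    echo wget --mirror --no-parent --no-clobber --page-requisites \
        --convert-links --html-extension \
        --domains "$domain" --restrict-file-names=windows "$url"
}

mirror_site "https://some-domain.com/some/path/to"
```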