Boris Mann

Open Source. Community. Decentralized Web. Building dev tools at Fission. Cooks & eats.



  • Last Edit: September 28, 2020

Download an entire website with wget

Source: Linux Journal

wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains example.com \
     --no-parent \
     https://example.com/

(`--domains` takes the domain to stay within, and the final argument is the URL to start from; replace `example.com` with the site you want to mirror.)
You may also want to add --limit-rate=10k (or some similarly slow speed) so that you don't trip the site's rate limiting and get blocked.
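As a sketch, the throttled invocation might look like this (`example.com` is a placeholder; `--wait` adds a pause between requests, which also helps you stay under the radar):

```shell
# Same mirror command with politeness flags added.
# example.com is a placeholder for the site you want to mirror.
wget --recursive --no-clobber --page-requisites --html-extension \
     --convert-links --restrict-file-names=windows \
     --domains example.com --no-parent \
     --limit-rate=10k --wait=1 \
     https://example.com/
```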

Checking this into a git repo and putting it on GitHub Pages is a good way of archiving sites.
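A minimal sketch of that workflow, assuming the mirror landed in an `example.com/` directory (the directory name and remote URL are placeholders):

```shell
# Hypothetical archiving workflow: commit the mirrored files to a
# fresh git repo. example.com and the remote below are placeholders.
cd example.com
git init
git add -A
# The -c identity flags are only needed if git isn't configured yet.
git -c user.name="Archiver" -c user.email="archiver@example.com" \
    commit -m "Archive of example.com"
# Push to GitHub and point GitHub Pages at this branch:
# git remote add origin git@github.com:USER/example.com-archive.git
# git push -u origin main
```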