How to Use the wget Linux Command to Download Web Pages
How To Download A Website Using wget

For this guide, you will learn how to download this Linux blog:

wget www.everydaylinuxuser.com

It is worth creating your own folder on your machine using the mkdir command and then moving into it using the cd command. For example:

mkdir everydaylinuxuser
cd everydaylinuxuser
wget www.everydaylinuxuser.com

The result is a single index.html file. On its own, this file is fairly useless, because the content is still pulled from Google: the images and stylesheets are all still held on Google's servers.

To download the full site and all of its pages, use the following command:

wget -r www.everydaylinuxuser.com

This downloads the pages recursively, up to a maximum of 5 levels deep. Five levels might not be enough to get everything from the site, so you can use the -l switch to set the number of levels you wish to go to, as follows:

wget -r -l10 www.everydaylinuxuser.com

If you want infinite recursion, you can use the following:

wget -r -l inf www.everydaylinuxuser.com