Sat May 30 22:27:15 PDT 2015

Making a Copy of a Web Site

Do you want to make a local copy of a web site for future reference? (Perhaps, for example, your boss has asked you to copy the internet to his laptop, for his convenience.) Well, for a specified site, that task is relatively straightforward with wget, which you can easily find on Linux, on Windows as part of Cygwin, and on OS X as an add-on.

Here is the syntax of the command to download this web site:

wget --mirror -w 1 -p --html-extension --convert-links -P ./ http://www.themolecularuniverse.com

The options here are:

'--mirror' get a copy of everything on the site

'-w 1' wait 1 second between downloads - this is optional and reduces the load on the site

'-p' get the requisites for each page - this means that inline images, style sheets, and other linked files needed to display the page will be downloaded too

'--html-extension' append an .html extension to any downloaded HTML pages whose names lack one

'--convert-links' make links in retrieved documents point to their local versions

'-P ./' put the retrieved files into the current working directory
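
If you prefer wget's long-form option names, the same command can be wrapped in a small shell script. This is just a minimal sketch: the script name mirror-site.sh and the TARGET variable are my own additions, not part of the original command.

#!/bin/sh
# mirror-site.sh - illustrative wrapper around the command above
TARGET="http://www.themolecularuniverse.com"   # site to mirror (assumed)
wget --mirror \
  --wait=1 \
  --page-requisites \
  --html-extension \
  --convert-links \
  --directory-prefix=./ \
  "$TARGET"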

wget can be very helpful...but of course you will just acquire a static copy of the target site and you will not get updates, enhancements, or corrections!
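
If you do want to pick up later changes, you can simply re-run the same command from time to time: --mirror already turns on wget's timestamping, so a fresh run refreshes the local copy rather than starting again from scratch. Here is a sketch of a weekly cron entry - the directory /home/zfs/mirrors and the 3 a.m. Sunday schedule are purely illustrative assumptions:

# refresh the mirror every Sunday at 03:00
0 3 * * 0 cd /home/zfs/mirrors && wget --mirror -w 1 -p --html-extension --convert-links -P ./ http://www.themolecularuniverse.com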


Posted by ZFS | Permanent link | File under: bash