Using wget to download a page and browse it locally
Run: [carpincho@bender]$ wget -r --convert-links url
What the arguments do:
- -r (same as --recursive): turns on recursive retrieving. The default maximum depth is 5.
- -k (same as --convert-links): after the download is complete, converts the links in the documents to make them suitable for local viewing. This affects not only the visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets, and hyperlinks to non-HTML content.
- url: the address of the page you want to download (obvious :P)
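Putting it together, a minimal sketch of the invocation might look like this; example.com is just a placeholder for whatever site you actually want to grab:

```shell
# Placeholder URL -- swap in the real page you want to browse offline.
url="https://example.com/"

# -r: recurse into linked pages (default max depth is 5)
# -k: after downloading, rewrite links so they work locally
cmd="wget -r -k $url"

echo "$cmd"
```

Once it finishes, open the downloaded index.html in your browser and the rewritten links should keep you inside the local copy.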