Website Extractor for Linux

Hey, there are many website rippers and copiers for offline browsing on Windows, but on Linux we don't need a separate tool at all: we have wget. wget is a powerful downloader that can fetch single files or mirror entire websites for offline browsing on Linux.

Let us see the power of wget now.

1. To save a file

Syntax

$ wget <URL>

Example

$ wget "http://elgg.org/getelgg.php?forward=elgg-1.8.4.zip"

$ wget http://elgg.org/index.php
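When the URL ends in a query string rather than a clean filename (as with the getelgg.php link above), wget's default output name can be awkward. The -O option saves the download under a name you choose; the filename below is just an illustration:

```shell
# Save the download as elgg-1.8.4.zip instead of the raw query-string name.
wget -O elgg-1.8.4.zip "http://elgg.org/getelgg.php?forward=elgg-1.8.4.zip"
```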

2. To extract full website

Syntax

$ wget -k -r -l <N> <URL>

where -k converts the downloaded links so they point to the local copies,

-r turns on recursive retrieval, and

-l <N> sets the recursion depth (0 means unlimited)

Example

$ wget -k -r -l 0 www.padasala.com
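A full recursive mirror can hammer the server, so it is polite to slow the crawl down. wget's --wait and --limit-rate options do exactly that; the values below are only example settings:

```shell
# Recursive mirror, pausing 2 seconds between requests
# and capping download bandwidth at 200 KB/s.
wget -k -r -l 0 --wait=2 --limit-rate=200k www.padasala.com
```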

3. To extract specific part of a website

Syntax

$ wget -m -k -np <URL>

Example

$ wget -m -k -np www.padasala.com/images/

where -m turns on mirroring (recursive retrieval with timestamping) and -np (no-parent) stops wget from ascending above the starting directory
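If you only want certain file types from that part of the site, the -A (accept-list) option filters downloads by suffix; the extensions below are just an example:

```shell
# Fetch only JPEG and PNG files from the images directory,
# without climbing to the parent directory.
wget -r -np -A jpg,jpeg,png www.padasala.com/images/
```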

Hope you enjoyed this. Happy offline browsing!