HTTrack allows you to download a World Wide Web site from the Internet to a local directory, recursively building all directories and getting HTML, images, and other files from the server to your computer. HTTrack preserves the original site's relative link structure.
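For instance, the command-line version can be invoked roughly like this (the URL and the output directory are placeholders; -O tells HTTrack where to write the mirror):
httrack "http://example.com/" -O ./example-mirror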
WebHTTrack Website Copier is a handy tool to download a whole website onto your hard disk for offline browsing. Launch the Ubuntu Software Center and type "webhttrack website copier" (without the quotes) into the search box, then select and install it. Start WebHTTrack from either the launcher or the start menu, and from there you can begin enjoying this great tool for your site downloads.
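If you prefer the terminal, the same tool can presumably be installed with apt instead (assuming the Ubuntu package is named webhttrack):
sudo apt-get install webhttrack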
I don't know about subdomains, i.e. sub-sites, but wget can be used to grab a complete site. Take a look at this superuser question. It says that you can use -D domain1.com,domain2.com to download different domains in a single run. I think you can use that option to download subdomains, i.e. -D site1.somesite.com,site2.somesite.com.
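A sketch of how that might look for a recursive grab (the domains and URL are placeholders; -H lets wget span hosts, and -D then limits which hosts are followed):
wget -r -H -D site1.somesite.com,site2.somesite.com http://site1.somesite.com/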
If speed is a concern (and the server's well-being is not), you can try puf, which works like wget but can download several pages in parallel. It is, however, not a finished product, not maintained, and horribly undocumented. Still, to download a web site with lots and lots of smallish files, it might be a good option.
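Given how sparsely puf is documented, the following is only a guess at the basic invocation, with placeholder URLs passed on the command line:
puf http://example.com/page1.html http://example.com/page2.html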
I use Burp - the spider tool is much more intelligent than wget, and can be configured to avoid sections if necessary. The Burp Suite itself is a powerful set of tools to aid in testing, but the spider tool is very effective.
Try example 10 from here; it uses the following options (the full command is sketched after the list):
--mirror : turn on options suitable for mirroring.
-p : download all files that are necessary to properly display a given HTML page.
--convert-links : after the download, convert the links in the document for local viewing.
-P ./LOCAL-DIR : save all the files and directories to the specified directory.
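A minimal sketch of the command those options add up to (http://example.com and ./LOCAL-DIR are placeholders for the site and the target directory):
wget --mirror -p --convert-links -P ./LOCAL-DIR http://example.com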
httrack is the tool you are looking for.
With wget you can download an entire website; you should use the -r switch for a recursive download.
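For example, a minimal recursive invocation might look like this (the URL is a placeholder):
wget -r http://example.com/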
You can download an entire website with a single command. Example:
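A sketch, assuming a typical recursive wget invocation (the flags and the URL are illustrative placeholders, not a specific recommendation):
wget -r -np -k -p http://example.com/
Here -r recurses through the site, -np (--no-parent) keeps wget from climbing to parent directories, -k (--convert-links) rewrites links for local viewing, and -p (--page-requisites) fetches the images and CSS each page needs.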