How do I download a whole website in general, and *.blogspot.com in particular? Note that I don't necessarily have admin access to that website; in fact, I am just trying to download a third-party website in case it goes up in flames...
I've found httrack (http://www.httrack.com/) very useful for this in the past.
If you use any tool to try to download an entire site (not just httrack), make sure you show a little consideration to the site. See httrack's "what not to do" page for some pointers on that.
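For reference, a command-line run might look roughly like this sketch (the blog URL and output directory are placeholders, and the filter syntax is worth double-checking against httrack's own documentation):

    # Mirror the blog into ./blog-mirror, staying within the blog's domain
    # (example.blogspot.com and ./blog-mirror are placeholders)
    httrack "https://example.blogspot.com/" -O ./blog-mirror "+*.example.blogspot.com/*" -v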
You can use wget to mirror the website, provided it does not have Flash- or JavaScript-based navigation.
Look here, or just check the command's manual; wget is available for Unix systems and for Windows.
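As a rough sketch (the URL is a placeholder; check man wget for the exact flags on your build), a mirroring run could look like this:

    # Recursive mirror with links rewritten for offline browsing;
    # --wait adds a pause between requests so you don't hammer the server
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent --wait=1 https://example.blogspot.com/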
wget, I believe, will crawl a page for you.
The -r option is what you want. Note, in the snippet below, the part about converting links for offline viewing: since you said you want to keep this page just in case it "goes up in flames", that option will let you browse it locally.
From the man page:
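Roughly (paraphrasing from memory, so check the exact wording with man wget on your system):

    -r, --recursive        turn on recursive retrieving
    -k, --convert-links    after the download is complete, convert the links in the
                           documents to make them suitable for local viewing

Putting the two together (the URL is a placeholder):

    wget -r -k https://example.blogspot.com/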
If you don't have admin access to the site to use its backup tool, you can back up the HTML contents of your pages by viewing the source, or, if you just want the actual written content of the articles, copy that. You can also download your images and other attachments from the site. This article gives you details of how you could do that in a more efficient way.
You can also use wget, if you have it, to get hold of the site information.
Bear in mind, though, that this will not give you everything you need to just take your blog and run it somewhere else; there is a whole dynamic backend behind Blogspot that serves your pages.
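If all you need is a handful of individual posts rather than the whole blog, a single-page grab along these lines (the post URL is made up) pulls the page plus the images and stylesheets it references and rewrites the links so it opens cleanly from disk:

    # -p fetches page requisites (images, CSS), -k converts links for local
    # viewing, -E saves pages with an .html extension; the URL is a placeholder
    wget -p -k -E https://example.blogspot.com/2010/01/some-post.html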
If you want something a little more advanced than wget, take a look at Black Widow.
In case this helps someone: SiteSucker is a super Mac OS X app that does what you want.
wget doesn't always do what you'd expect, especially with JS-based menus, even with its plethora of options, and httrack is not a visual tool on the Mac (it's installed with Homebrew). SiteSucker is by far the simplest and most reliable tool for a local download of all the HTML and assets, as if you were running a full static version of the site.