I'm trying to mirror all of the links in an XML sitemap file and download them as static HTML files.
I found the following command, which is supposed to do this, but it doesn't actually download anything:
wget --quiet http://www.mydemosite.com/sitemap.xml --output-document - | egrep -o "https?://[^<]+" | wget -i -
I found this thread here:
https://stackoverflow.com/questions/17334117/crawl-links-of-sitemap-xml-through-wget-command
So my question is: how can I mirror all of the links in an XML sitemap file and download them as static HTML files using wget?
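For reference, here is a minimal sketch of the two-step approach: extract the URLs from the sitemap's <loc> elements, then feed them to wget with the flags that produce browsable static HTML. The sample sitemap below is fabricated so the extraction step can be shown offline; www.mydemosite.com is taken from the question. Note that the bare regex also matches the xmlns namespace URL, which is why it's filtered out:

```shell
# Step 1: a tiny sample sitemap (illustration only; normally you'd fetch
# the real one with: wget --quiet -O - http://www.mydemosite.com/sitemap.xml)
cat > sitemap.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mydemosite.com/page1.html</loc></url>
  <url><loc>http://www.mydemosite.com/page2.html</loc></url>
</urlset>
EOF

# Step 2: pull the URLs out of the <loc> elements; drop the xmlns
# namespace URL, which the regex also picks up.
grep -Eo 'https?://[^<]+' sitemap.xml | grep -v 'sitemaps\.org' > urls.txt
cat urls.txt

# Step 3 (needs network, shown commented out): fetch each URL as a static
# page. --adjust-extension saves files with an .html suffix,
# --convert-links rewrites links for offline browsing, and
# --page-requisites also grabs the CSS/JS/images each page needs.
# wget --adjust-extension --convert-links --page-requisites -i urls.txt
```

If the extraction step prints nothing for your real sitemap, check whether the server is gzip-compressing it or redirecting, since either would leave egrep with nothing to match.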
Thanks