Using the wget command, how do I instruct it to overwrite my local file every time, irrespective of how many times I invoke it?
Let's say I want to download a file from http://server/folder/file1.html. Whenever I run

wget http://server/folder/file1.html

I want file1.html to be overwritten on my local system, irrespective of when it changed, whether it was already downloaded, etc. My use case is that when I call wget, I am very sure that I want to replace/overwrite the existing file.
I've tried out the following options, but each one is meant for some other purpose (their behavior is sketched after the list):
- -nc => --no-clobber
- -N => Turn on time-stamping
- -r => Turn on recursive retrieving
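For reference, this is how those options behave against the example URL; the commands are illustrative, the behavior is as documented in the wget manual:

# -nc: if file1.html already exists locally, wget refuses to download at all
wget -nc http://server/folder/file1.html
# -N: re-downloads only when the remote copy is newer than the local one
wget -N http://server/folder/file1.html
# -r: re-downloads do overwrite, but files are saved under a
# server/folder/ directory tree rather than the current directory
wget -r http://server/folder/file1.html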
This option works:

wget -q http://server/folder/file1.html -O file1.html

-q is quiet mode, so you can throw it in a cron job without any output from the command.
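For instance, a crontab entry along these lines (the schedule and destination path are placeholders) silently refreshes the file every hour:

0 * * * * wget -q http://server/folder/file1.html -O /home/user/file1.html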
Use curl instead?

I don't think you can do it unless you also download the directories (so pass the -x flag). If you know what the file is, you can use -O filename, so for example:
wget http://yourdomain.com/index.html -O index.html
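The argument to -O can also be an absolute path, which helps when the command is not run from the download directory (the path below is just an example); either way, -O truncates and rewrites the named file on every invocation:

wget http://server/folder/file1.html -O /var/www/html/file1.html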
Untried: maybe you can work with wget -r --level=0.

Another possibility: curl -O overwrites (but it uses a different way of choosing the file name, which may or may not matter to you).
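For the file from the question, that would be the following; curl -O saves under the file's remote name (file1.html) in the current directory, overwriting any existing copy there:

curl -O http://server/folder/file1.html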
Why not put a small wrapper around wget in your script? The script could move all the files to a temporary location, then wget the remote files / web pages. On success, delete the files in the temporary location. On failure, move the files back and raise an error.
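A minimal sketch of such a wrapper, assuming the URL and file name from the question (the temp-directory handling and error message are illustrative choices):

#!/bin/sh
# Hypothetical wrapper: stash the current copy, fetch a fresh one, restore on failure.
url="http://server/folder/file1.html"
file="file1.html"
tmpdir=$(mktemp -d)

# Move any existing copy aside, then attempt the download.
[ -f "$file" ] && mv "$file" "$tmpdir/"
if wget -q "$url" -O "$file"; then
    rm -rf "$tmpdir"    # success: discard the stashed copy
else
    rm -f "$file"       # drop the partial/empty file wget -O leaves behind
    [ -f "$tmpdir/$file" ] && mv "$tmpdir/$file" "$file"   # failure: restore
    rm -rf "$tmpdir"
    echo "wget failed; previous copy restored" >&2
    exit 1
fi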
There isn't a simple way to do what you want using just wget unless you know the names of all the files in advance, in which case the -O option will let you force the filename of the downloaded file.