I need to pull files from an FTP server regularly, and I've found that I can fetch them easily enough using wget:
wget -m --user=yyy --password=xxxx ftps://host.com.au
That works really nicely. The problem is that it obviously leaves the files behind on the remote server, so the next time I run the script it downloads them all again.
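For context, the "script" is currently little more than that command run on a schedule; roughly this (the local directory is a placeholder):

cd /path/to/local/mirror && wget -m --user=yyy --password=xxxx ftps://host.com.au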
I saw that wget has a --delete-after flag, which at first glance seemed ideal, but, as the man page says, it only deletes local files, not remote ones.
Is there a way to achieve this? Unfortunately it has to be via FTP, as I don't have shell or rsync access to the remote server. Should I be looking at something other than wget?
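To make the goal concrete, here is roughly the behaviour I'm after, sketched with lftp purely as an illustration (untested; the local path is a placeholder, and I'm going from lftp's documented mirror --Remove-source-files option, which deletes each remote file once it has been transferred):

lftp -u yyy,xxxx -e "mirror --Remove-source-files / /path/to/local/mirror; quit" ftps://host.com.au

Something along those lines would be ideal, but I'd welcome any tool or approach that does a fetch-then-delete over plain FTP/FTPS.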