Greetings, I am using ncftpput to transfer a lot of files from one server to another:
ncftpput -f server.txt -vRdb /public_html /var/www/site.com
What happens is, the connection gets cut after about 100 transfers, and there are thousands of files that have to be transferred. The problem is that after the connection is cut and I re-enter the command, it starts from the beginning, replacing all the existing files and making the previous transfer redundant. Is there a way to skip existing files? It's not in the man page, so I'm assuming not. Can anybody suggest an alternative command-line solution?
lftp is a great tool for this; it's scriptable using its mirror command:

$ lftp -f script_file
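For example, script_file could look something like this (a minimal sketch; the host, user and password are placeholders):

open -u user,password ftp.server.example.com
mirror -R --only-newer /var/www/site.com /public_html
quit

Here mirror -R pushes the local tree up to the server, and --only-newer skips files that already exist remotely and are up to date, so an interrupted run can simply be restarted.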
You could try to work with ncftpput's -DD option. This deletes each local file after it has been successfully transferred, so if you don't want the files in your local source directory (/var/www/site.com in your command) to be deleted, copy them to a temporary directory first and upload from there. You can drive it with a small wrapper script like the sketch below. Don't forget to set $FTPUSER, $FTPPASS and $FTPHOST; the ! -d "$FILE" test in the script skips any directories listed in server.txt. Finally, make the script executable with chmod 755 yourscriptfile.
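A minimal sketch of such a wrapper (untested; the credentials are placeholders, and it assumes server.txt holds one local path per line):

#!/bin/bash
# Placeholders -- set these to your real credentials and host.
FTPUSER="user"
FTPPASS="password"
FTPHOST="ftp.example.com"
# Upload each file listed in server.txt, skipping directories.
while read -r FILE; do
    if [ ! -d "$FILE" ]; then
        # -DD deletes the local copy after a successful upload.
        ncftpput -u "$FTPUSER" -p "$FTPPASS" -DD "$FTPHOST" /public_html "$FILE"
    fi
done < server.txt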
Have you tried this option?

-A    Append to remote files, instead of overwriting them.

(from the ncftp site)
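Applied to the command from the question, that would look something like this (a sketch only; note that -A helps with partially transferred files but will double up files that were already uploaded in full, so test it first):

ncftpput -f server.txt -vRdb -A /public_html /var/www/site.com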
Try wput
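For example (an untested sketch; the user, password and host are placeholders, and the wput man page documents the exact skip/resume behaviour):

wput /var/www/site.com/ ftp://user:password@server.example.com/public_html/

wput can resume interrupted uploads, which is the main reason to try it here.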
Is there any reason why you can't use rsync rather than ftp?
IMO rsync is the right tool for this job. It will only transfer new and changed files, which means that if the transfer dies and has to be restarted, it will pick up where it left off rather than starting from the beginning again.
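A typical invocation for this case might look like the following (assuming you have SSH access to the destination server; the host is a placeholder and the paths are taken from the question):

rsync -avz --partial /var/www/site.com/ user@server.example.com:/public_html/

-a preserves permissions and timestamps, -z compresses data in transit, and --partial keeps partially transferred files so a restarted run can resume them.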