I have about 6 TB of files I need to move from one server to another. I did my best to move them over FTP with the command below, but the connection drops constantly, and after a certain amount of progress it disconnects before it even resumes transferring files, presumably because comparing the already-transferred files takes so long that the session times out before any actual transfer starts.
~/ncftp-3.2.3/bin/ncftpput -R -z -v -u "user" -p "password" upload.server.net /local/dir/ remote/dir/
I'm trying to get the remote server to give me SSH access so I can set up rsync, but in the meantime, is there anything more stable I can do over FTP, ideally something that retries on its own and resumes without recomparing the entire file list?
If your only access is via FTP, you may want to look into lftp (should be in most distros).
lftp supports automatic retries on failure and also has a mirror option which sounds like it matches what you want to do.
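Something along these lines should do it (a sketch, untested, using the host, credentials, and paths from your question as placeholders):

lftp -u user,password -e "set net:max-retries 0; set net:reconnect-interval-base 5; mirror --reverse --continue --verbose /local/dir/ remote/dir/; quit" upload.server.net

mirror --reverse uploads instead of downloads, and --continue skips files that are already complete on the remote side and resumes partial ones, so a dropped connection only costs you the reconnect delay when you rerun it.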
I'd use rsync. If the connection drops, it will compare source and destination and pick up where it left off (assuming a large number of small to medium files, not 2 x 3 TB :) ).
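Once you do get SSH access, the usual pattern is something like this (a sketch, with the host and directories from your question standing in):

rsync -av --partial --progress /local/dir/ user@upload.server.net:remote/dir/

--partial keeps partially transferred files so they can be resumed, and re-running the same command only sends what's missing, so you can wrap it in a simple retry loop (e.g. while ! rsync ...; do sleep 30; done) to ride out the disconnects.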
Alternatively, start Apache with your file directory as the document root and do a recursive wget on the other end; that might work as well. You just need to tell it to skip files that already exist locally.
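A rough sketch of the wget side, assuming the source server is reachable at a made-up URL like http://source.server/files/:

wget -r -np -nH -nc http://source.server/files/

-r recurses, -np stops it climbing above the starting directory, -nH drops the hostname directory, and -nc skips anything that already exists locally, so after a drop you just rerun the same command. Be aware that -nc skips by name only, so a partially downloaded file from a previous run won't be completed.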
You don't say what the constraints are for this project. In my own situation, I would handle this by backing up the data on the source and restoring it to the destination with my backup software.
If that isn't possible, why not try moving the data in batches, assuming it's split up into a number of files or directories?
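For example, a simple loop over the top-level directories with the ncftpput you're already using (a sketch, untested, keeping the placeholders from the question and ncftpput's host, remote-dir, local-files argument order):

for d in /local/dir/*/ ; do
    ~/ncftp-3.2.3/bin/ncftpput -R -z -u "user" -p "password" upload.server.net remote/dir/ "$d" || echo "failed: $d"
done

Each batch is much smaller, so the pre-transfer comparison finishes before the connection times out, and -z still lets a rerun resume whatever a failed batch left behind.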