I have about 600MB on a remote web server that I'd like to copy to a PC, then copy to a new server. I tried FileZilla, but it failed partway through the transfer from the old server and couldn't recover without losing track of what was already finished. Is there a better way?
Thanks for the replies. For clarification, here are more details of the problem and my final results.
Both remote servers are *nix systems with SSH and shell access. I hadn't considered moving the files directly without the intermediate PC; I decided I wanted to keep it as part of the process, mostly to provide a backup of the files. The PC in the middle is Windows, but I have most Linux utilities available as part of Cygwin.
My decision to copy to Windows had an unexpected benefit - it revealed a couple of subtle details of my file structure that might otherwise have gone unnoticed. I generated checksums of all the files on both servers and the Windows PC with the following:
find . -type f -exec cksum {} \; | sort >sums.txt
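Diffing the two lists then shows exactly which files don't match (the filenames here are just whatever you saved each machine's output as, and this assumes find was run from the same top-level directory everywhere):

diff old_server_sums.txt windows_sums.txt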
I found a couple of mismatches in the Windows copy. First, two files had names identical except for upper/lower case, so on the case-insensitive Windows filesystem the second overwrote the first. Second, a soft link got converted to a regular file.
I think my initial problems with FileZilla were caused by the upper/lower case duplicate, which aborted that file's transfer. By the time I checked back on the transfer status, the server connection had been lost, which explains why I was unable to recover.
Rsync over SSH.
If the files are exposed over HTTP or FTP, use wget. It is like a pitbull: it holds on and doesn't give up if you specify unlimited retries. I once downloaded a several-MB file at about 6 bytes/second :)
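Something along these lines, for example (the URL is a placeholder; -c resumes a partial download and --tries=0 means retry forever):

wget -c --tries=0 http://oldserver.example.com/files/backup.tar.gz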
If you have shell access, you can also use rsync.
It would help to know what OS the servers are. FTP is going to be your best bet, I would think. Although, if you have shell access, I would log into the server you're copying to and do a direct copy from the source server, getting rid of the PC in the middle.
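For example, logged into the destination server, something like this would pull the whole tree in one hop (user, host and paths made up):

scp -r user@oldserver:/var/www/site /var/www/site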
Linux or other *nix
Rsync over SSH, as has been suggested in other answers. It's secure, will resume where it left off if there are any interruptions, and can compress the files as it transfers.
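A typical invocation would look something like this (user, host and paths are placeholders; -a preserves permissions and symlinks, -z compresses on the wire, --partial keeps partially transferred files so a rerun can finish them):

rsync -avz --partial user@oldserver:/var/www/site/ /var/www/site/

If the connection drops, just run the same command again and rsync skips everything it already transferred.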
Windows
Robocopy is probably going to be your best bet.
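For instance, something like this (share and target paths are made up; /E copies subdirectories, /Z uses restartable mode so an interrupted copy can resume, /R and /W set the retry count and wait between retries):

robocopy \\oldserver\share C:\backup /E /Z /R:10 /W:5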
Keep using FileZilla.
Rsync over SSH is not necessarily better: FileZilla can connect to SSH on port 22 and tunnel to an FTP service on ports 20/21 on that same machine, and it already has the muscle to do this with its queuing and SSL capability.
FileZilla has a very mature "resume" capability that should handle ANY connectivity dropout when properly configured.
If you want to avoid the unreliable dual port/channel nature of FTPS or FTP (which could be your issue), you could run the "mini" version of CoreFTP server on port 22 on the destination machine, connect to it with WinSCP or the CoreFTP client, and do the transfer over a single port.
FTP is probably the best way (i.e. what you are doing).
Your best bets now are compressing the files (i.e. zipping them up somehow) and splitting them into smaller chunks using something like http://www.filesplitter.org/ (for example).
You can do both of these tasks with something like WinRAR and then download the pieces.
But of course it depends on what kind of access you have to the host machine and what the host machine is. Is it even Windows?
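If it turns out to be *nix with shell access, the same compress-and-split idea works with stock tools (archive name and chunk size here are arbitrary):

tar czf - public_html/ | split -b 50M - site.tgz.part-

and on the receiving end, reassemble and unpack with:

cat site.tgz.part-* | tar xzf -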
I use FileZilla because you can set it to skip files that are the same age or older, so you won't re-transfer files unless they're newer. If I want to copy files directly from one server to another, I use Rapid Transfer Script (www.rapidtransferscript.com). It can copy big 1GB files in under 90 seconds.