I recently asked a similar question about how to transfer large files (several gigabytes to many tens of gigabytes) over the Internet when latency is high. I got a great answer, but it only covered downloads. When I want to upload something from my computer, which is behind a firewall that only allows outgoing TCP connections, to a server, I can't get lftp or any other tool to saturate my gigabit upload (to a gigabit server that is 115 ms away). lftp seems to cap out around 36.88 Mbps, even when doing a parallel transfer, such as with this command:
lftp -e 'mirror --parallel=1 --use-pget-n=10 -R . /destination/path' sftp://user@destinationdomain
I also tried some variations of this command (e.g. using put instead of mirror, and different flags; one such attempt is shown below), but nothing gets anywhere near saturating my gigabit upload.
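For example, one of the put attempts looked roughly like this (largefile.bin is just a placeholder for an actual file; the destination is the same as above):

lftp -e 'put -c largefile.bin -o /destination/path/largefile.bin; bye' sftp://user@destinationdomain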
It's possible that there's simply a bad link between me and the server: although both ends are gigabit, neither is on a particularly good network (both sometimes have trouble sending data to far-away places). To test this hypothesis, I ran a speedtest.net test against another ISP in the same city as the destination server (the host of my server does not run a speedtest.net server), and the result seems to rule out any issue with the intermediate networks (at least, nothing severe enough to cause 36.88 Mbps). This is further strengthened by the fact that my route to that speedtest.net server not only uses the same long-distance transit provider (Hurricane Electric), it even appears to pass through the same datacenter my server is hosted in, with the same provider!
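For reference, I compared the two routes with ordinary traceroutes from my end, roughly like this (the speedtest host name here is illustrative):

traceroute destinationdomain
traceroute speedtest-host.example.net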
So I ask: how can I easily (meaning with a single command) upload from my computer to a remote server while saturating my gigabit connection, or at least a significant fraction of it, as speedtest.net is able to do?