I'm currently using HTTP for pretty much all downloads. However, the files on our UK-based server are being accessed by people all over the world, often in third-world countries with dodgy internet connections (we give our software away free to third-world countries).
The file is over 100MB in size, and for a minority of people the HTTP download will think it's complete after 0.5MB or some other trivial amount. Asking them to delete their IE browser cache and download again over HTTP sometimes helps.
But what always works is to point them to the same file that they can download via FTP.
I'd always read that there was no advantage to using FTP over HTTP, but is there some kind of extra integrity checking that FTP does? Is it worth swapping completely over to FTP for our file downloads?
HTTP proxies can also limit object sizes or enforce timeouts, which could cut a large transfer short.
FTP supports download resume, meaning that if your client in South Africa downloads 99MB and gets cut off, he can pick up where he left off. From the sounds of it, you could benefit greatly from this.
However, not all FTP clients support resume, so you will most likely want to have some good documentation available for your users.
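If you end up scripting the download side yourself, resume boils down to checking how much of the file you already have and asking the server to start from that byte. Here's a minimal sketch using Python's `ftplib`, which sends the FTP `REST` command for you via the `rest` argument of `retrbinary`. The host name and file paths are placeholders, not anything from your setup:

```python
import os
from ftplib import FTP

# Hypothetical server and paths -- substitute your own.
HOST = "ftp.example.org"
REMOTE = "downloads/installer.bin"
LOCAL = "installer.bin"

def resume_offset(path):
    """How many bytes we already have locally; resume from there."""
    return os.path.getsize(path) if os.path.exists(path) else 0

def download_with_resume(host, remote, local):
    offset = resume_offset(local)
    ftp = FTP(host)
    ftp.login()              # anonymous login
    ftp.voidcmd("TYPE I")    # binary mode, required before REST/RETR
    mode = "ab" if offset else "wb"
    with open(local, mode) as f:
        # rest=offset makes ftplib issue a REST command, so the server
        # starts the transfer at that byte instead of byte 0.
        ftp.retrbinary(f"RETR {remote}", f.write, rest=offset)
    ftp.quit()

if __name__ == "__main__":
    download_with_resume(HOST, REMOTE, LOCAL)
```

Command-line clients can do the same thing; `curl -C - ftp://...` asks curl to work out the resume offset from the partial local file automatically.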