I am designing a process for end users to upload files to an FTP server. The critical requirement is to ensure the connection to the server is secure.
I know many FTP client applications can create a secure FTP connection (e.g. FTPES or SFTP — and yes, the FTP server does support these), but it's an optional setting in the client. In other words, we can ask people to create secure FTP connections, but we can't force them to.
I should mention here that the FTP server belongs to a third-party provider, and if there are server settings to enforce FTPES or SFTP connections, we can't get them enabled.
So, the question is: is there a way to enforce secure FTP connections? Here are a few speculations:
Maybe there's an FTP client that forces a secure FTP connection to be used and I tell end users this is the only client they can use. This is a bit lame!
Maybe there's an FTP client that can get its FTP connection details (URL, protocol & login) from a remote server (i.e. a server I control), so I could dictate them and the end user would never see them (see the sketch after this list).
Maybe I could establish some kind of "2-hop" connection where the user initially connects to a server I control that requires a secure connection, but (transparently to the user) the connection is actually redirected to the real FTP server.
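For that second idea, here's roughly what I have in mind, sketched in Python: the wrapper fetches its connection details from a server I control and always connects with explicit FTPS. The config URL, JSON field names and credentials below are all invented (and the config endpoint itself would obviously need protecting).

```python
# Rough sketch of the "client fetches its own settings" idea.
# The config URL, JSON field names and credentials are all invented.
import json
import urllib.request
from ftplib import FTP_TLS

CONFIG_URL = "https://config.example.com/ftp-settings.json"  # server I control

def load_settings():
    # Returns e.g. {"host": "...", "user": "...", "password": "..."}
    with urllib.request.urlopen(CONFIG_URL) as resp:
        return json.load(resp)

def secure_upload(local_path, remote_name):
    cfg = load_settings()
    ftps = FTP_TLS(cfg["host"])   # explicit FTPS (FTPES)
    ftps.login(cfg["user"], cfg["password"])
    ftps.prot_p()                 # encrypt the data channel as well
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()

if __name__ == "__main__":
    secure_upload("report.csv", "report.csv")
```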
Your third idea seems the most promising: consider an FTP proxy server. Your users connect to the proxy with the connection requirements you set, such as encryption, and the proxy server connects to the destination server with the parameters you configure.
Unless you can either enforce or audit a policy, you cannot get users to follow it. And a security framework is only as strong as its weakest link.
The scenario seems strange to me: you have a requirement for confidentiality, to be met by encrypting the data traffic, but you are working with a third party that won't meet this requirement. There may be a need to run the problem up the management flagpole as well.
One possible approach: set up your own server for users to save files to internally, then set up a script to mirror the data to the remote site. That means you take responsibility for securely transferring the data at set times rather than leaving it to the user (which may even eliminate some headaches for you, unless files need to be available on the remote server immediately).
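A minimal sketch of the mirroring step, assuming the third-party server supports explicit FTPS (FTPES) as you say it does; the host, credentials and drop directory below are placeholders, not real values:

```python
# Minimal sketch of the mirror script, assuming the remote server speaks
# explicit FTPS (FTPES). Host, credentials and paths are placeholders.
from ftplib import FTP_TLS
from pathlib import Path

REMOTE_HOST = "ftp.example.com"
LOCAL_DROP = Path("/srv/ftp-drop")   # where users save files internally

def mirror():
    ftps = FTP_TLS(REMOTE_HOST)
    ftps.login("mirror-user", "secret")
    ftps.prot_p()                    # protect the data connection too
    for path in LOCAL_DROP.iterdir():
        if path.is_file():
            with path.open("rb") as f:
                ftps.storbinary(f"STOR {path.name}", f)
    ftps.quit()

if __name__ == "__main__":
    mirror()
```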
Unless the server can be set to only allow secure connections, I don't know of any way to be 100% sure users are doing what they're supposed to, and you already said you can't reconfigure the third-party server.
Another bonus to this approach is that you get a kind of "backup" of the FTP data, and local access to files is faster than going over the webbertubes. Having this "FTP proxy" can also save on your bandwidth.
If you want to dig further into it, you could even have a script on your server that users can trigger to do the upload to the remote site, or a cron job that checks an upload directory every five minutes and pushes any new files to the remote site.
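A sketch of that cron variant, extending the mirror script above: each run uploads anything new in the drop directory, then moves it into a "sent" subfolder so it isn't uploaded twice (again, every name here is an example):

```python
# Cron-friendly incremental upload: push new files, then move them aside.
# Host, credentials and paths are placeholders.
from ftplib import FTP_TLS
from pathlib import Path

DROP = Path("/srv/ftp-drop")
SENT = DROP / "sent"

def push_new_files():
    SENT.mkdir(exist_ok=True)
    pending = [p for p in DROP.iterdir() if p.is_file()]
    if not pending:
        return
    ftps = FTP_TLS("ftp.example.com")
    ftps.login("mirror-user", "secret")
    ftps.prot_p()
    for path in pending:
        with path.open("rb") as f:
            ftps.storbinary(f"STOR {path.name}", f)
        path.rename(SENT / path.name)   # mark as uploaded
    ftps.quit()

if __name__ == "__main__":
    push_new_files()
```

A crontab entry like `*/5 * * * * python3 /usr/local/bin/push_new_files.py` would then give you the every-five-minutes check.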
I think only the multi-hop approach is really going to work.
Maybe you could firewall the server so that it doesn't accept connections on the standard FTP port?
Buy a virtual machine somewhere away from your bandwidth/storage and do what Bart suggests.
(By sftp I assume you mean the SSH-related SFTP.)
You can force them to use a secure connection if you only allow secure connections to the server. If they don't use a secure connection, they can't get in; they call you, and you tell them to configure their FTP client to use a secure connection. Problem solved.
I assume, particularly since you are referring to a third-party FTP server, that the FTP and SFTP services are running on their respective default ports, which are different. Are you able to configure your firewalls to block FTP connections to that server but allow the SFTP/SSH traffic through?
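If that route is workable, here's a quick way to sanity-check the firewall policy from a client machine (just a sketch; substitute the real hostname): plain FTP on port 21 should be blocked, while SSH/SFTP on port 22 still gets through.

```python
# Quick firewall sanity check: port 21 should be unreachable, port 22 open.
# Replace the host with the real third-party server.
import socket

HOST = "ftp.example.com"

def port_open(host, port, timeout=5):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(f"ftp/21 reachable: {port_open(HOST, 21)}")   # want False
    print(f"ssh/22 reachable: {port_open(HOST, 22)}")   # want True
```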