I need to set up an SFTP server that, essentially, has very large capacity. I need to give one of our partners SFTP login details to a server where they will upload millions of files, totalling a few hundred terabytes. I will then read some of those files, selectively and quite rarely. That is the only hard requirement; any technology choice is up for grabs.
The easiest approach that comes to mind is an EC2 instance running an SFTP server, set up so that anything uploaded either goes straight to S3, or a background process discovers new files as they arrive, copies them to S3, and deletes them from local disk. I've sketched the second variant below.
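For concreteness, here is a rough sketch of that sweep-and-upload process in Python. The bucket name, upload directory, and "quiet period" are placeholders I made up, and it assumes boto3 is installed and AWS credentials are configured; a real version would obviously need error handling and logging:

```python
#!/usr/bin/env python3
"""Sweep the SFTP landing directory, copy finished files to S3, then delete them.

Rough sketch only: bucket, paths, and the quiet-period threshold are placeholders.
"""
import os
import time

import boto3

UPLOAD_ROOT = "/home/partner/uploads"  # chrooted SFTP landing directory (placeholder)
BUCKET = "partner-archive"             # destination bucket (placeholder)
QUIET_SECONDS = 300                    # skip files modified recently; they may still be mid-upload

s3 = boto3.client("s3")

def sweep():
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(UPLOAD_ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Only touch files that have been idle long enough to be complete.
            if now - os.path.getmtime(path) < QUIET_SECONDS:
                continue
            key = os.path.relpath(path, UPLOAD_ROOT)  # mirror the directory layout as the S3 key
            s3.upload_file(path, BUCKET, key)         # boto3 handles multipart upload for big files
            os.remove(path)                           # free local disk only after a successful upload

if __name__ == "__main__":
    while True:
        sweep()
        time.sleep(60)
```

The mtime check is there so the sweep doesn't grab files the partner is still in the middle of uploading over SFTP.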
Is this the best way? Is there any other way of getting a server that essentially has "infinite and magically growing disk space"?
Thanks for your help! Daniel