I wasn't sure if this was the best place for this question, but I think it is squarely in the realm of the IT admin, so that's why I put it here.
We need to share large files (several Gigabytes) with external clients. We need a simple way of reliably and automatically publishing these files so that clients can then download them. Our organization has Windows desktops and a Windows SBS 2011 server.
Sharing from our server is probably suboptimal from the client's perspective because of the low upstream bandwidth of a typical ADSL connection (around 1 Mbps): a 4 GB file is roughly 32,000 Mbit, so it would take about 9 hours for the client to download it.
Uploading to a third-party server is good for the client but painful for us, because we then have to deal with a multi-hour upload.
Uploading to a third-party server would be less problematic if it could be made reliable and automatic, e.g. something like a Groove/SharePoint Workspace: simply drop the file in and wait for it to synchronize. But Groove has a 2 GB limit, which is not big enough.
So ideally I'd like a service with the following attributes:
- Must work for files of at least 5 GB, preferably 10 GB
- Once the transfer is started, it must be reliable (i.e. not sensitive to disconnections and service outages) and completely automatic
- Ideally, the sender would get a notification when the transfer completes.
- Has to work with Windows-based systems.
Any suggestions?
I assume you are looking for a server solution and not a SaaS product, otherwise this question would be off-topic.
SparkleShare is open-source software that could satisfy your needs.
But I would recommend just using rsync to mirror to a remote server that has the bandwidth you require. You can set up this system in minutes and it does everything you want.
Just specify a source folder and drop your files in, then give your clients web or FTP access to the remote server.
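For example, a single mirroring pass could look like this (the local folder, user and host names here are only placeholders to replace with your own):

```bash
# Push everything in the local drop folder to the remote server's
# download directory; -a preserves attributes, -z compresses in transit.
rsync -az /data/outgoing/ transfer@fileserver.example.com:/var/www/downloads/
```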
I would recommend a shell script with an endless loop that executes rsync and sleeps for 1 second after each iteration. Compared to a cron job, this has the advantage that you don't get parallel uploads interfering with each other - and with files this large, overlapping runs would definitely be an issue.
Rsync even has the advantage that files are first uploaded as "hidden" temporary files with a . prefix and a random suffix, so that the client only sees files that have finished uploading and passed an integrity check.
If a file is updated, rsync also handles that efficiently, transferring only the changed parts.
A script along these lines should be perfect for you:
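Here is a minimal sketch of that loop; the local drop folder, remote user, host and destination path are placeholders to adjust for your setup:

```bash
#!/bin/bash
# Endless-loop mirror: one rsync pass at a time, so runs never overlap.
# SRC and DEST below are assumptions - substitute your own paths/host.

SRC="/data/outgoing/"
DEST="transfer@fileserver.example.com:/var/www/downloads/"

while true; do
    # -a  archive mode (recursive, keeps timestamps and permissions)
    # -z  compress data during transfer
    # --partial-dir keeps interrupted transfers in a hidden directory,
    #   so they resume on the next pass and clients never see
    #   unfinished files
    rsync -az --partial-dir=.rsync-partial "$SRC" "$DEST"
    sleep 1
done
```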
Just monitor that it keeps running, add it to autostart, etc.
You can also benefit from rsync's built-in compression (the -z option).
http://www.filecatalyst.com/
You can use the above: set up a dedicated Windows box for this, create your folder structure for different clients, and share those folders on your local LAN. Any file that goes into those folders automatically starts transferring.
You can also provide your clients with a web interface URL or a FileCatalyst client which connects to your server and downloads these large files.
Riverbed is another solution (google it).
Hope this helps