As part of our development workflow, we would like to have our build server publish our compiled binaries and other build artifacts to some LAN-based, replicated file system. Ideally, we'd just have our build server drop the build artifacts into a folder named for the build version, and the contents of that folder would be replicated locally to each developer workstation. Once published, each build is static and does not change. We don't have strong consistency requirements; we only need the files to be available on each developer workstation within a minute or so. All machines are Linux-based.
The easiest solution is probably a NAS plus rsync and cron, but that feels clunky. Is there another distributed/replicated file system that meets the above requirements? We looked at Amazon S3 + s3fs, but that was painfully slow.
In many ways this feels like traditional mirroring from an authoritative source in a webserver scenario, e.g. www1, www2, etc.
I think AFS (or OpenAFS) might be a good solution for you. It has been implemented in the Linux kernel since 2.6.10 and uses Kerberos for authentication.
If you only need one-way replication, you really don't want a distributed filesystem. That introduces a lot of complexity for no gain. I'd go for the rsync approach. Have the last step in your build script be pushing out the artifacts if the build was successful. If you'd rather not make this be part of your build scripts, you could use cron to do it at certain intervals or lsyncd to do it when files are created or modified.
I would also suggest looking into approaching this with a distributed version control system such as git. You could use commit hooks to keep everybody up to date, and the independence you would gain by not doing this at the file-system level would save you a lot of reliability headaches down the road. You might even find other advantages to having the stuff versioned along the way.
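One way that hook could look, as a rough sketch: a post-receive hook on a central "builds" repository that asks each workstation's clone to fast-forward after every push. The hostnames (`dev1`, etc.) and the clone path are placeholders, and this assumes passwordless SSH from the server to the workstations:

```shell
#!/usr/bin/env bash
# Sketch of a post-receive hook for a central builds repository:
# after a push, tell each workstation's clone to fast-forward.
set -euo pipefail

notify_workstations() {
    local clone_path="$1"
    shift
    local host
    for host in "$@"; do
        # Fire the pulls in parallel so one slow workstation doesn't
        # hold up the others; --ff-only keeps the clones pristine.
        ssh "$host" "git -C $clone_path pull --ff-only" &
    done
    wait
}

# Installed as hooks/post-receive, it might be invoked as e.g.:
# notify_workstations /srv/builds dev1 dev2 dev3
```

Failed pushes to an unreachable workstation would need retry handling in practice; a cron-driven `git pull` on each workstation is a simpler fallback that still meets the one-minute freshness requirement.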