I'm designing a new server setup for hosting multiple websites. (Shared hosting for my clients over at SliceHost.) I've recently moved away from the traditional LAMP setup and chosen Ubuntu, Nginx, php-fpm and mysql.
I like it a lot better than my old Apache, suphp, mysql setup. It works great, provides encapsulation between sites, and uses substantially less memory. However, I have one major maintenance problem: in order to have a recent version of Nginx and to use php-fpm, I've had to compile these programs from source.
The reason I see this as a problem is that keeping track of updates and build configurations will end up being a lot of work. For two programs (and a patch) I can handle it, but it seems like this setup would not scale to many packages and servers. Are there good ways to manage this situation? I'm sure people do this all the time.
Instead of compiling from source and deploying, build (or find) Ubuntu packages for the newer versions you need. Often you can take the build files from an old version and just use the newer source. You can then maintain your packages like any other, and only worry about tracking configuration files.
The Debian New Maintainers' Guide is pretty helpful in this situation, specifically Chapter 9, "Updating the package". While it may seem scary at first, updating a package to a newer upstream release can be as simple as a few commands.
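For instance, a minimal sketch of that workflow, assuming you have deb-src lines in your sources.list and the devscripts package installed (the package name, URL, and version numbers here are purely illustrative):

```shell
# Grab the distro's existing packaging (the debian/ directory and all)
apt-get source nginx

# Fetch the newer upstream tarball you actually want
wget http://nginx.org/download/nginx-1.0.5.tar.gz

# From inside the old source tree, merge the packaging with the new source
cd nginx-0.8.54
uupdate -u ../nginx-1.0.5.tar.gz

# Build binary packages from the updated tree (unsigned, for private use)
cd ../nginx-1.0.5
debuild -us -uc
```

The resulting .debs then upgrade, downgrade, and uninstall like any stock package, so dpkg and apt keep tracking the installed files for you.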
I package every piece of software using the distro's package manager. It has many advantages: dependencies are tracked for you, installs and removals are clean, and the same package rolls out identically to every server.
The core problem you've got is that rather than having the distribution track security updates and apply them for you, you've got to handle that yourself. You can make life a little easier, though, by subscribing to the security announcements list for your distro and filtering it for the updates you care about. I have a procmail script that is kept automatically up to date by my package builder (any package in there goes on the "let announcements for this package through" list), and anything that passes the filter (meaning "I need to consider a manual update for this") drops into the todo list (ticketing system) for further handling.
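As a rough sketch of what such a filter can look like (the list header and package names below are assumptions for illustration; in practice the recipe would be regenerated from the package builder's list):

```procmail
# Hypothetical recipe: announcements for packages we build ourselves
# land in a todo mailbox; the rest of the list is just archived.
:0
* ^List-Id:.*ubuntu-security-announce
{
  :0:
  * ^Subject:.*(nginx|php5-fpm)
  security-todo/

  :0:
  security-archive/
}
```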
The standard fare would be to package the compiled software and create an apt repository.
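A flat apt repository can be as little as the following sketch (paths and hostname are placeholders; dpkg-scanpackages ships in the dpkg-dev package, and larger setups usually graduate to a tool like reprepro):

```shell
# Collect the built .debs somewhere the web server can see them
mkdir -p /var/www/repo
cp ~/pkgbuild/*.deb /var/www/repo/

# Generate the Packages index that apt expects
cd /var/www/repo
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

# Each client then needs one sources.list line:
#   deb http://repo.example.com/repo ./
# after which the packages install via plain apt-get.
```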
I don't believe you can automate much further. There's a definite overhead that only becomes worth it with a respectable number of servers.
I agree 100% with Kamil Kisiel.
I came from 15 years of Windows and did a lot of research to figure out which Linux flavor I'd work with. I have just upgraded Fedora 13 to 14, and one thing I have to say is: maintaining software installed from source is a real nightmare.
Even though the official repositories are sometimes behind the cutting edge, installing software from YUM or another package manager is fast, clean, and more secure.
I know I may think differently somewhere down the road, but right now I believe this is the best choice for a regular guy like me.
As others have mentioned, you should build your own packages. To keep up to date with the upstream sources, subscribe to their mailing list(s); some projects have a low-volume -announce list.
Yes, this is more work, but that's what it costs to stay on the bleeding edge.