I'm in the process of setting up a web server and I was wondering what the best-practice method of updating the website it contains would be. I'm aware that I should have some sort of 'test' server running in parallel so that changes can be tried there first before they go live. So how closely should the test server mimic the production box, and is there a known method of easily deploying the same changes to the production box once they've been tested?
I would really like to have a foolproof method of testing and applying changes to the production website, and any advice in this regard is much appreciated.
Platform:
It's an Ubuntu Linux server running MediaWiki, WordPress, and a mail server. I have root access to the box via SSH. MediaWiki and WordPress both have PHP and MySQL backends, and the mail server uses a MySQL database as well.
We use a set of virtual machines, at least one for each live client server, that are kept in sync with the live machines (aside from some of the data being munged so that we are not working with personally identifying or otherwise sensitive information).
Any set of changes due to go to a production machine is put together as a patch: the files/scripts needed plus a control script (a shell script for a Unix-like environment; a batch file, VBScript, or PowerShell script under Windows) that applies them appropriately. The test cycle then runs as follows (a sketch of the core loop is below):

1. Take the relevant VM and refresh its database(s) from a copy of production if the VM's copy is getting too out of date (remembering to re-run the relevant SQL scripts for wiping or randomising sensitive data).
2. Take a snapshot of the VM in this state. Snapshots are supported by VirtualBox and most VMware products, including the free VMware Server, and probably most other virtualisation solutions too.
3. Apply the patch as you would to the production server, by running the main control script.
4. If there are errors applying the changes, roll the VM back to the snapshot (which takes a few tens of seconds at most), update the patch, and try again. Repeat as needed until the patch applies cleanly.
5. Once it applies cleanly, ask some testers to give it all a try to make sure the new stuff works and the old stuff has not been broken. How long you spend on this depends on the scope and severity of the changes and on your level of paranoia. If problems are found, repeat the roll-back, edit, and re-apply cycle until all seems well.
6. Once everything seems fine and time allows, run one last rollback-patch-and-test cycle, just in case.

After all this you should have a patch that can be applied to the production environment by running a single script with appropriate parameters (i.e. the relevant passwords, as authentication credentials on the testing VMs should differ from those on the production machines), and you can be pretty confident that it will apply cleanly and have the desired effect with no (or as few as possible) undesired ones.
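As a rough illustration, here is a minimal sketch of that snapshot/apply/rollback loop using VirtualBox's VBoxManage commands. The VM name, snapshot label, and the patch script's location are hypothetical placeholders:

    #!/bin/sh
    # Hypothetical names - substitute your own VM, snapshot label, and target.
    VM="client-staging"
    SNAP="pre-patch"
    TARGET="deploy@staging-vm"

    # Capture the known-good state before touching anything.
    VBoxManage snapshot "$VM" take "$SNAP"

    # Run the patch's main control script on the test VM.
    if ssh "$TARGET" 'sh /tmp/patch/apply.sh'; then
        echo "Patch applied cleanly - hand it over to the testers."
    else
        # Restoring a snapshot requires the VM to be powered off first.
        VBoxManage controlvm "$VM" poweroff
        VBoxManage snapshot "$VM" restore "$SNAP"
        VBoxManage startvm "$VM" --type headless
        echo "Patch failed - VM rolled back to '$SNAP'." >&2
        exit 1
    fi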
Always take fresh backups before applying anything significant to a production system, no matter how much time you've spent testing the update, just in case. Always plan for at least a little downtime too, during which you can keep the other users out and give the resulting updated system(s) a further paranoia test (and roll them back to the latest backup if something goes badly wrong; if your production environment is virtualised, the snapshot facility can be useful here as well).
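For the stack described in the question (PHP files plus MySQL databases), the pre-update backup can be as simple as the sketch below; the database names, web root, and backup location are assumptions to adapt:

    #!/bin/sh
    # Timestamped pre-update backup. All paths and DB names are examples.
    STAMP=$(date +%Y%m%d-%H%M%S)
    BACKUP_DIR="/var/backups/site-$STAMP"
    mkdir -p "$BACKUP_DIR"

    # Dump each application database (credentials read from ~/.my.cnf).
    for DB in mediawiki wordpress maildb; do
        mysqldump --single-transaction "$DB" | gzip > "$BACKUP_DIR/$DB.sql.gz"
    done

    # Archive the web roots alongside the dumps.
    tar -czf "$BACKUP_DIR/www.tar.gz" /var/www

    echo "Backup written to $BACKUP_DIR"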
As a general process, this can work in any environment.
Use Capistrano (see the Capistrano and PHP guide).
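Once a deploy recipe is written for the site, the day-to-day workflow from the shell is just a few commands; these are the standard built-in tasks in classic Capistrano:

    cap deploy:setup     # one-time: create the releases/shared layout on the server
    cap deploy           # push a new release and flip the 'current' symlink to it
    cap deploy:rollback  # point 'current' back at the previous release if needed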
The test environment should mimic production as closely as possible and be independent (so no shared data, etc.).
It really depends on your definition of "live": do you mean a server with 100% uptime that cannot go down at all, or is this just a general question about updating a site?
The easiest thing you can do is enable FTP access to your box/account/storage. If you only have minor changes that do not break anything else, simply overwrite the files while the site is live; the next time someone refreshes a page, it will load the new file.
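Since the question mentions SSH access, the same quick overwrite can also be done with rsync rather than FTP; the paths here are purely illustrative:

    # Push only the changed files from a local working copy to the live docroot.
    # -a preserves permissions/timestamps, -v lists what changed, -z compresses.
    rsync -avz ./site/ root@example.com:/var/www/site/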
If, however, you are making big changes that will break other things, you have (IMHO) a few choices:
I find that this approach works efficiently and well; many users are not even aware of any downtime during the update. If you are running a shop or similar, it is probably a good idea to be honest and have the error page say something like "We are doing site updates, back online within 2 minutes".
Also, the above assumes that you have already tested the changes and that they work fine; all you will be doing is a few file operations, which will not take long at all. A sketch of that kind of quick swap follows.
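One common pattern for keeping those file operations down to near-zero downtime is to stage the new version next to the old one and swap a symlink; the directory layout below is hypothetical:

    #!/bin/sh
    # Hypothetical layout: releases live in /var/www/releases/<version> and
    # the web server's docroot points at the /var/www/current symlink.
    NEW=/var/www/releases/2.0
    OLD=$(readlink /var/www/current)

    # Swapping the symlink is near-instant, so visitors see at most one
    # slow request rather than minutes of downtime.
    ln -sfn "$NEW" /var/www/current

    # If the new version misbehaves, point back at the previous one:
    # ln -sfn "$OLD" /var/www/current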