My company will shortly be setting up a blog, and I'm planning on setting up two web servers to host the WordPress website for redundancy. Normally when we do releases to a site in a farm we push to one side, test, and then release to the other side. For WordPress updates we can do this easily enough. The problem, however, is how to handle the wp-content folder. As people publish posts and upload graphics, these will need to be synced to the other server in the farm no matter which server the user uploads them to.
I could set up DFS to replicate the files, but that seems like overkill.
I could set up robocopy to run every 15 minutes or so, then tell everyone who posts to schedule publication at least 15 minutes out so that the files have time to replicate.
Are there any better solutions out there? Perhaps a WordPress plugin that automatically replicates graphics to the other servers in the farm as they're uploaded?
I'm running WordPress on Windows 2008, so Linux solutions won't help much.
I'm not a wizard with IIS, but hopefully the technique will translate over.
I'm presuming that there's a shared hostname that is load balanced between the two servers, and that there is also a publicly accessible name for each.
What you want is a conditional redirect on each of the two servers, combined with some kind of file sync. If the URI starts with /wp-content and the file exists, serve it locally; otherwise redirect to the other server. Server A redirects to B and vice versa.
This should result in a seamless experience for viewers - they'll just get a temporary redirect for images in the window between the post going up and the sync running. Depending on bandwidth or redundancy concerns, your sync interval could be much longer than 15 minutes, since the site should render properly the moment the post goes up.
In nginx, I'd do this with a block like so:
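A minimal sketch, where serverb.example.com stands in for the other node's public name (mirror the config on server B, pointing back at A):

```
# Serve wp-content from local disk when the file exists;
# otherwise fall through to a temporary redirect at the peer.
location /wp-content/ {
    try_files $uri @peer;
}

# Named fallback location: a 302 keeps clients and caches from
# pinning the image to this node once the sync catches up.
location @peer {
    return 302 http://serverb.example.com$request_uri;
}
```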
nginx is available for Windows, but I doubt you want to switch web server software to do this. Hopefully the idea can be converted over to IIS or whichever software you're using.
I'm putting this as a separate answer because it's a different approach:
What about putting the images in cloud storage (Amazon S3 or similar) and having your users link to the cloud copies? The bandwidth costs might be a bit higher, and there are possibly training issues in getting users to upload to the cloud first, but it eliminates the need for local filesystem or cross-server checks.
It should also scale regardless of the number of servers you deploy.
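If retraining users is a sticking point, a minimal sketch of a server-side alternative with the AWS CLI (the bucket name is a placeholder): a scheduled task pushes new uploads from whichever server received them, and posts link to the bucket's URLs.

```
:: Hypothetical scheduled task: push new uploads to the bucket so posts
:: can reference the bucket's public URLs instead of local files.
aws s3 sync C:\inetpub\wwwroot\wp-content\uploads s3://example-blog-media/uploads
```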
We've used Super Flexible File Synchronizer for stuff like this in the past. It works really well and has a number of options to control syncing.
Is having the content on a single network file share (no DFS) an option?
How about Unison? It does two-way file synchronization, and there are Windows builds available.
You can use rsync for this. Otherwise, if you have files under source control, you could use something like Capistrano to roll things out to different machines (and even roll back if necessary).
When you have more than one machine, being able to deploy and roll back is very useful.
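A minimal sketch of the rsync direction (paths and hostname are placeholders; on Windows 2008 you'd need Cygwin or a port such as DeltaCopy to run rsync at all):

```
# -a preserves timestamps/permissions, -v is verbose, -z compresses in
# transit; no --delete, so neither side purges the other's new uploads.
rsync -avz /cygdrive/c/inetpub/wwwroot/wp-content/ serverb:/cygdrive/c/inetpub/wwwroot/wp-content/
```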
How about robocopy with the following switches to detect changes and re-run the sync automatically?
/MON:n :: MONitor source; run again when more than n changes seen.
/MOT:m :: MOnitor source; run again in m minutes Time, if changed.
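A minimal sketch, assuming a job like this runs on each server pointing at the other (paths and server names are placeholders). /E copies subdirectories without /MIR's purge behavior, so neither side deletes the other's new uploads, and /XO skips files that are older than the destination copy:

```
:: Re-run after 1 detected change (/MON:1) or every 15 minutes (/MOT:15).
robocopy C:\inetpub\wwwroot\wp-content \\SERVERB\wwwroot\wp-content /E /XO /MON:1 /MOT:15
```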
Anyway, what did you finally use for this issue?