Here's my scenario: I'm a developer who (unbeknownst to me) inherited three servers located within my office. I also inherited the job of being the admin of those servers, with a distinct lack of server administration knowledge and only Google / Server Fault as a reference point. Luckily, I've never actually had to come into physical contact with the machines or address any issues, as they've always 'just worked'.
All three machines are located within the same data room and serve the following purpose:
Machine1
- IIS 8.0 hosting a number of internal applications
Machine2
- SQL Server 2008 R2 data store for the internal applications
Machine3
- SQL Server 2008 R2 mirror store of Machine2
All three have external hard drives connected that complete backups frequently.
I've been informed that all three need to move from one data room to another within the same premises. I won't be completing the physical moving of the hardware; that'll be handled by a competent mover.
Apart from completing a full back up of each, what considerations do I need to make prior to hypothetically flicking the power switch and watching my world move?
I'm aware that it's far from ideal having all three located in the same room / premises but that's past the scope of this question.
Genuinely interesting question, well asked :)
There are a few things you need to check before this move, some easy, some hard.
Power - check that the new room has not only the right number of power outlets but also that they're the right type, as in physical connector type. And if the current location provides different power phases per server to protect against single-phase failure, then I'd strongly urge you to replicate that in the new location too.
Cooling - you need to check that there won't be an immediate or gradual build-up of heat that leads to overheating and potential server shutdown. You can usually look up the maximum power (in watts) or heat output (in BTU/hr) of each server on the manufacturer's website - let your building manager know these figures and get written confirmation from them that the cooling in the new location will cope.
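As a back-of-the-envelope aid: 1 W of power draw works out to roughly 3.412 BTU/hr of heat. A minimal sketch, with made-up wattages you'd replace with the manufacturer's real figures:

```python
# Rough heat-load estimate for the new room. The per-server wattages are
# assumptions for illustration; look up the real maximum draw per server.
server_draw_watts = {
    "Machine1": 450,
    "Machine2": 600,
    "Machine3": 600,
}

total_watts = sum(server_draw_watts.values())
total_btu_hr = total_watts * 3.412  # 1 W dissipated ~= 3.412 BTU/hr

print(f"Total draw: {total_watts} W -> ~{total_btu_hr:.0f} BTU/hr of heat")
```

Hand the BTU/hr figure (plus some headroom) to the building manager when you ask for that written confirmation.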
Networking - this is the hard one. Not only does the same number of ports need to be replicated between the old and new locations, but so do their type, speed and, most importantly, configuration. That last point is the key: there was a time when almost all ports in a network were pretty much equal (I'm old enough to remember those times!), but these days the number of possible port configurations, and the places in the network any one port can occupy, is astronomical. You need to make sure your network people replicate EVERYTHING identically from old to new - again, get this in writing, as this isn't easy. If something goes wrong with this move, I'd put money on it being the network ports not being identical; it happens all the time. (A quick post-move reachability sketch follows this list.)
'Other connections' - do you know if your servers have any connections other than power and networking? Perhaps they have Fibre Channel links to shared storage, or KVM links to a shared management screen - again, if they do, you need to replicate these identically.
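Once everything is plugged back in, a quick reachability check saves a lot of guesswork. A minimal sketch, assuming placeholder hostnames and the default ports (80 for IIS, 1433 for SQL Server); adjust to your environment:

```python
# Minimal post-move reachability check. Hostnames and ports are assumptions.
import socket

targets = [
    ("machine1.example.local", 80),    # IIS
    ("machine2.example.local", 1433),  # SQL Server
    ("machine3.example.local", 1433),  # SQL Server mirror
]

for host, port in targets:
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"OK    {host}:{port}")
    except OSError as exc:
        print(f"FAIL  {host}:{port} -> {exc}")
```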
Other than that feel free to come back here with any more specific questions, and I hope the move goes well.
Other answers cover the technical aspects of the move. You may also have to consider some other things.
Make sure users know that their applications will be down during the move. You will want to schedule the move, perhaps during non-working hours, so that you minimize the number of people affected.
Have a knowledgeable person (or persons) test the applications after you bring up the servers. Have them do some sanity checks to make sure the applications work as expected.
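For that first sanity pass, even a tiny scripted smoke test helps before the humans dig in. A sketch, assuming placeholder URLs for the internal IIS applications:

```python
# App-level smoke test after bring-up. URLs are placeholders; a 200 only
# proves the app answers, so still have a knowledgeable user test workflows.
from urllib.request import urlopen
from urllib.error import URLError

app_urls = [
    "http://machine1.example.local/app1/",
    "http://machine1.example.local/app2/",
]

for url in app_urls:
    try:
        with urlopen(url, timeout=10) as resp:
            print(f"{resp.status}  {url}")
    except URLError as exc:
        print(f"FAIL  {url} -> {exc}")
```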
After the testing, tell your users that the move is finished and have them let you know if they have any problems.
It's quite difficult to tell, and borderline "too broad" for our format. The most important thing you need to check is whether you need to reconfigure your network in any way, or if the machines can keep running with the same addresses. Even if they can keep the same addresses, make sure they are not configured via DHCP, and/or verify the DHCP server will be available at the new location.
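As a rough way to spot DHCP-assigned addresses on a Windows box, you could parse the output of ipconfig /all. A sketch, assuming an English-locale install (your network team can confirm properly):

```python
# Flag DHCP-assigned addresses by scanning "ipconfig /all" output.
# Assumes an English-locale Windows install; adjust the markers otherwise.
import subprocess

output = subprocess.run(
    ["ipconfig", "/all"], capture_output=True, text=True, check=True
).stdout

for line in output.splitlines():
    if "DHCP Enabled" in line or "IPv4 Address" in line:
        print(line.strip())
# "DHCP Enabled . . . : Yes" on a server NIC is a red flag before a move:
# make the address static (or reserve it) so it survives the relocation.
```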
Side note: as you already stated, having the SQL server and its mirror at the same location is far from ideal. However, having the backup drives at the same location is really dangerous. You need to keep your backups in a different physical location.
Other answers have good pre-move considerations. However, you should also plan how you organize the actual move. Since Machine3 is a mirror of Machine2, uptime is evidently a significant consideration for the SQL Server 2008 R2 database(s). That mirror also presents an opportunity: the whole reason a mirror exists is to be available when the primary server is not, and that includes unavailability due to maintenance, which includes moving.
Make a plan:
You should make a written plan for how the move will be carried out. You may need to be able to provide this plan, or parts of it, to people handling portions of the work (e.g. the movers). This plan should include all pre-move activities, the actual move, and post-move actions (e.g. verification of functionality).
More detailed description of the move:
The following includes two methods (Path A and Path B) of using Machine3 to test the connections for Machine1 and/or Machine2. Use at most one of them. Which to use, or whether to use either at all, depends on information not contained in the question (e.g. physical separation of the final machine locations, physical size of the machines, length of network/power cords, availability of extensions for same, similarity of network port configurations, uptime needs, etc.). Using Machine3 to test these connections potentially allows higher uptime for Machine2, but particularly for Machine1, which has no mirror.
1. Move Machine3 first.
2. Path A (optional): before moving Machine2, place Machine3 in the spot where Machine2 will end up and use it to confirm the power and network connections there.
3. Move Machine2. Fail the mirror over to Machine3 first so the databases stay available during this step; a sketch of the failover follows this list.
4. Path B (not needed if you tested all connections with Machine3 in optional step 2): if you now have Machine3 where Machine1 is to end up, use it to confirm Machine1's connections the same way.
5. Move Machine1.
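For the failover in step 3: with a synchronous mirror (safety FULL), SQL Server 2008 R2 supports a manual failover via ALTER DATABASE ... SET PARTNER FAILOVER, run on the current principal. A minimal sketch, assuming pyodbc is installed and "InternalAppsDB" is a placeholder database name:

```python
# Manually fail the mirror over to Machine3 before taking Machine2 down.
# Server and database names are placeholders; run against the principal.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=machine2.example.local;Trusted_Connection=yes;",
    autocommit=True,  # ALTER DATABASE cannot run inside a transaction
)
conn.execute("ALTER DATABASE InternalAppsDB SET PARTNER FAILOVER;")
conn.close()
# After the move, run the same statement against Machine3 (now the
# principal) to fail back to Machine2.
```

Note that the applications only follow the failover transparently if their connection strings specify a Failover Partner; otherwise plan to repoint them.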
If any of the servers' IPs will change, and connections are made to the SQL box via DNS resolution, then you will need to schedule a change to the DNS records at the same time as the move.
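After the DNS change, it's worth confirming that the records resolve to what you expect. A sketch with placeholder names and addresses:

```python
# Verify DNS records after the move. Hostnames and IPs are placeholders.
import socket

expected = {
    "machine1.example.local": "10.0.20.11",
    "machine2.example.local": "10.0.20.12",
    "machine3.example.local": "10.0.20.13",
}

for host, want in expected.items():
    try:
        got = socket.gethostbyname(host)
    except socket.gaierror as exc:
        print(f"FAIL      {host} -> {exc}")
        continue
    status = "OK" if got == want else "MISMATCH"
    print(f"{status:9} {host}: expected {want}, got {got}")
# Lower the records' TTLs well before the move so stale entries expire fast.
```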
Things you should know about the intranet software and databases:
If you don't get the exact same IPs, or if you end up on a different subnet, you will need access to change the source code or configuration files for any apps that connect to the SQL server. People could also be relying on undocumented, direct SQL access for ad-hoc reporting.
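To get ahead of that, you could scan the applications' config files for hard-coded server references. A sketch, assuming the default IIS webroot (adjust the path and patterns to your setup):

```python
# Find hard-coded SQL Server references in .config files under the webroot.
# The path is an assumption (default IIS location); change it to match.
import re
from pathlib import Path

webroot = Path(r"C:\inetpub\wwwroot")
pattern = re.compile(r"(Data Source|Server)\s*=\s*[^;\"']+", re.IGNORECASE)

for cfg in webroot.rglob("*.config"):  # web.config, app.config, ...
    text = cfg.read_text(errors="ignore")
    for match in pattern.finditer(text):
        print(f"{cfg}: {match.group(0)}")
```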
Utilize your "Disaster Recovery" servers: switch over to them to handle the load while you move your production servers. With properly configured DR equipment you can do the move in the middle of the day without seeing much downtime (up to 15 minutes), as the disaster recovery servers should be configured in the same manner as the production servers. If you don't have DR equipment, I highly recommend getting some.
Think of it this way: while your Corvette is getting a tune-up, use your minivan to get through the day.
One thing I don't think has been mentioned is the physical security of the servers' new home. What was the room used for before, and who has keys to it? Is there adequate security (alarm systems, cameras, etc.)?
Some considerations in addition to the other answers:
Are the applications linked to other ones, e.g. by nightly exchange of data files or by use of web services? What are the consequences when your applications are not available? Can the related applications cope with this, or do they fail or even produce wrong results due to the lack of information from your applications?
Is downtime acceptable for your users, company or even clients? How long may it be?
I think it is a good idea to have a rollback plan. You can use it in case of a problem that can't be resolved quickly, e.g. a network problem. You will probably need to keep the mover available in case the hardware has to be brought back.
Do your applications generate high network traffic, and does the network at the new location have to be prepared for this (probably a much less likely problem than issues with addresses and firewalls)? If you have real-time applications (e.g. video conferencing software), latency will be important.
The servers must fit into the server rack if you have one.