We are in a tough spot. We have a hosted server with the following specs:
OS: Windows Server 2008 R2 Enterprise SP1 64bit
Processor: Intel Xeon X7550 @ 2GHz (8 processors)
RAM: 16GB
The file system is on a SAN or NAS (not sure).
We are seeing very odd issues where a user will open a 25 MB .xslb file and it sometimes takes literally 60-120 seconds. The server is just dog slow for Excel.
Resources are not being pegged, CPU never jumps up, plenty of RAM... it's just oddly slow.
Our host has been looking at the issue for several weeks with not much to show for it. Is there a utility I can run myself that will help track down our issue?
I have found Server Performance Advisor V1.0. Does anyone have experience using it?
Our host is ultimately responsible for fixing this, but we are going on 1 month and our users are losing patience. Any tips would be helpful.
(Do you mean .xlsb, rather than .xslb?)
For a quick view of "are my disks able to keep up with my requests": open Performance Monitor and add the Avg. Disk Queue Length counter for that drive (the LogicalDisk or PhysicalDisk object is fine). As a gross generalization, it shouldn't spike above 10; that rule of thumb will help you tell good from bad. Disk queue length should normally sit between 0 and 1, but on really overworked servers I've seen it climb into the hundreds of thousands and stay there. It's just a count of how many requests are waiting for the disk to accept them for processing.
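If you'd rather grab a quick sample from the command line than click through perfmon, here's a minimal sketch that shells out to the built-in typeperf tool and flags samples above that rough threshold of 10. The LogicalDisk instance (C:) and the 5-second/12-sample window are assumptions; point it at whichever volume actually holds the workbooks.

```python
# Rough sketch: sample "Avg. Disk Queue Length" every 5 seconds for a minute
# via the built-in typeperf tool and flag anything that looks backed up.
# The LogicalDisk instance (C:) is an assumption -- adjust it to the drive
# that actually holds the Excel files.
import csv
import io
import subprocess

COUNTER = r"\LogicalDisk(C:)\Avg. Disk Queue Length"

result = subprocess.run(
    ["typeperf", COUNTER, "-si", "5", "-sc", "12"],
    capture_output=True, text=True, check=True,
)

# typeperf prints CSV-style rows: a timestamp column and the counter value.
for row in csv.reader(io.StringIO(result.stdout)):
    if len(row) != 2:
        continue                      # blank lines / status text
    timestamp, value = row
    try:
        queue_len = float(value)
    except ValueError:
        continue                      # header row with the counter name
    status = "SUSPECT" if queue_len > 10 else "ok"
    print(f"{timestamp}  queue={queue_len:6.2f}  {status}")
```

Run it on the server while someone opens one of the problem files and see whether the queue actually backs up.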
But why look at disk I/O at all? You know the thing is slow, so to me perfmon is there to rule the disks in or out as the culprit. The disk latency counters (Avg. Disk sec/Read and Avg. Disk sec/Write) should stay under the magical 20 ms.
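To get real latency numbers to compare against that 20 ms guideline, here's a rough sketch along the same lines that logs the read/write latency and queue-length counters to a CSV for about an hour. The _Total instance, the 15-second interval, and the C:\perflogs path are all assumptions; adjust them to your environment.

```python
# Rough sketch: log disk latency and queue-length counters to a CSV so you
# have real numbers to compare against the ~20 ms guideline.
# The _Total instance, 15 s interval, and C:\perflogs path are assumptions.
import pathlib
import subprocess

counters = [
    r"\LogicalDisk(_Total)\Avg. Disk sec/Read",
    r"\LogicalDisk(_Total)\Avg. Disk sec/Write",
    r"\LogicalDisk(_Total)\Avg. Disk Queue Length",
]

pathlib.Path(r"C:\perflogs").mkdir(exist_ok=True)

subprocess.run(
    ["typeperf", *counters,
     "-si", "15",                              # sample every 15 seconds
     "-sc", "240",                             # 240 samples = about an hour
     "-f", "CSV",
     "-o", r"C:\perflogs\disk_latency.csv",
     "-y"],                                    # overwrite an existing log
    check=True,
)
```

The sec/Read and sec/Write columns are in seconds, so 0.020 is the 20 ms mark.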
For a more comprehensive look, I'd use PAL (Performance Analysis of Logs), a fantastic tool: you run it, tell it what type of "workload" you're running, and it spits out a config file for Performance Monitor. Import that file into perfmon, run it for 24 hours on a typical day, then take the log output and feed it back into PAL, which spits out a nice HTML summary report using Microsoft best practices and real-world thresholds to help you chase down potential issues. Here's more to think about:
Why are you sure it's the server that's responsible for this slow performance?
The last time I had this "issue" reported to me, it was a 40 MB Excel file with over 200 pivot tables (and a dozen external links) that had to be calculated on the client machine every time it was opened. Go to a workstation, download the Excel file, open Task Manager, and watch the local machine's CPU load while opening the file.
In my case, the file took 87 seconds to open, with full processor load on a dual CPU, quad core i7 system. Once I confirmed that, I pushed the problem back to where it belonged, which would be the moron who created a spreadsheet with so many entries and calculations it could choke a mainframe. I'd bet good money the problem's the same in your case. The spreadsheet is too beefy for the client PCs to process in a timely manner, which is what happens when end-users try to do anything remotely technical - they don't do it remotely right, and everyone suffers as a result.
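If you want a number instead of eyeballing Task Manager, here's a minimal sketch of that client-side test: it times how long Excel itself takes to open the workbook via COM automation on the workstation. It assumes Excel and the pywin32 package are installed locally, and the file path is a placeholder for a local copy of the slow file.

```python
# Minimal sketch: time how long Excel takes to open the workbook on the
# local workstation, separating "slow client-side calculation" from
# "slow server/storage". Assumes Excel and pywin32 are installed; the path
# below is a placeholder for a local copy of the file.
import time
import win32com.client

WORKBOOK = r"C:\temp\copy_of_slow_file.xlsb"   # hypothetical local copy

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
excel.DisplayAlerts = False

start = time.perf_counter()
wb = excel.Workbooks.Open(WORKBOOK)            # includes any on-open recalculation
elapsed = time.perf_counter() - start

print(f"Workbooks.Open took {elapsed:.1f} seconds")

wb.Close(False)                                # close without saving
excel.Quit()
```

If a copy sitting on the local drive opens just as slowly, the server and the SAN are off the hook and the workbook itself is the problem.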