I have a test environment for my ASP.NET 4.0 web applications. It runs on a dedicated Windows Server 2003 machine (IIS 6), but it's slow! CPU never reaches 100% and the same goes for memory. When I run the application on the production server, it's fast as hell.
I think it has to do with the horrible bandwidth the test server has. It uses a remote SQL server. How can I check whether the remote SQL server is the bottleneck? And how can I check whether the machine's upload speed is the bottleneck?
Thanks in advance, JP
Edit: the network speeds. Average ping from the test server to the database is 33 ms. www.Speedtest.net gives the test server 10 Mbit down / 0.8 Mbit up, and the remote database server 94 Mbit down / 74 Mbit up. But 0.8 Mbit up (test server) is about 100 kilobytes per second. Should that be enough for several simple SQL queries, or not?
I have used test environments set up the same way in the past, on a connection of similar speed to yours, and it was barely noticeable that the SQL Server was somewhere else (testing only — I never tried heavy usage).
The only exception is when a large amount of data is transferred to the app rather than small result sets. You can see the same effect if you remote-desktop onto the remote server and run a query with a lot of results: there the results come back within 1–2 s, but run the same query remotely and the download takes 3x/4x/5x that, depending on your connection.
Can you add a couple of timers to your web app to report the time taken to connect to the SQL Server, the time taken to get the results, and the time taken to render the page? Or you might be able to get the same information from a SQL trace. (And gather the same numbers on the production server for comparison.)
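A minimal sketch of what I mean, using a `Stopwatch` around ADO.NET calls — the class name, method name, and the idea of returning a summary string are just illustrative; plug in your own connection string and query:

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

public static class QueryTimer
{
    // Times the connect phase and the query+fetch phase separately,
    // so you can see which step the remote link is slowing down.
    // Call this from Page_Load (or anywhere) while testing.
    public static string TimeQuery(string connectionString, string sql)
    {
        var sw = Stopwatch.StartNew();
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            long connectMs = sw.ElapsedMilliseconds;

            using (var cmd = new SqlCommand(sql, conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { } // drain all rows so transfer time is included
            }
            long totalMs = sw.ElapsedMilliseconds;

            return string.Format("connect: {0} ms, query+fetch: {1} ms",
                                 connectMs, totalMs - connectMs);
        }
    }
}
```

If connect time is high on the test server but query+fetch is similar to production, it's latency; if query+fetch balloons on queries with big result sets, it's the bandwidth.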