Apache's ab tool lets us benchmark a server to see how many pages per second it can handle. The results are obviously going to vary based on the server's hardware specs and software configuration. My question is: how do you know when your server's results are too low and could be improved, and when they are acceptable or already optimized?
Benchmarking is such a complicated topic...
My points: use different amounts of concurrency in your tests, e.g.
1, 2, 8, 16, 64, 256, 512, 1024, 2048.
And push lots of requests, until
you start seeing errors returned by ab.
That's when you start to find the limits of your application + web server (a rough sketch follows below).
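Something along these lines, for example; the URL and request count are just placeholders, and very high concurrency may need the client's open-file limit raised:

    # Hypothetical sweep over increasing concurrency against a test URL.
    # -n keeps the total request count high enough for a stable average.
    for c in 1 2 8 16 64 256 512 1024 2048; do
        echo "== concurrency $c =="
        ab -n 20000 -c "$c" http://example.com/ 2>&1 | \
            grep -E "Requests per second|Failed requests|Non-2xx"
    done

When "Failed requests" or non-2xx responses start climbing, or ab starts reporting connection errors, you've found the knee of the curve.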
Also keep in mind the RTT and network delay between your test machine and the server.
Remember to run your tests for a while (or with a high enough request count) to get a good average.
I should also note that you can easily max out the CPU on the machine you're testing from before you max out the server you're actually testing, so keep an eye on CPU and memory on your testing machine.
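For example, assuming the load generator is a Linux box, something as simple as this is enough to spot the problem:

    # Run on the load-generating machine while ab is going.
    # If the "id" (idle CPU) column drops near zero, or "si"/"so" show
    # swapping, the client is the bottleneck rather than the server.
    vmstat 1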
Hope that helps :D
If your results are even near the general ballpark figure your scenario should have (at least thousands of requests per second for small static files, at least hundreds of requests per second for simple dynamic PHP, perhaps less for heavy dynamic content with databases and stuff) and the server is still chugging along nicely, then you're OK.
It's also important that during the load test the server still has room to breathe. If it barely has any CPU time or spare memory left, then you're gonna be in big trouble during sudden traffic spikes and/or growth of your site.
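One quick way to eyeball that headroom during the test (standard Linux tools, adjust for your OS):

    # On the server under test, while the benchmark is running:
    uptime       # load average vs. number of cores
    free -m      # how much memory is actually left
    vmstat 1 5   # us/sy/id CPU split and any swap activity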
Trust your instincts. If you have just a small database for the content and during the benchmark the database server seems to eat lots of CPU, go and double-check everything. Is the database server itself configured OK? Are all the expected indexes in place? Is the site performing some half-assed SQL queries which could be optimized and/or cached? Something else? Or if your web server seems to spend a lot of time even serving out static content, check the web server settings.
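If it's MySQL (an assumption on my part; other databases have equivalents), a couple of quick checks might look like this. The query, table and database names are placeholders for your own schema:

    # Does the suspicious query actually use an index?
    mysql -e "EXPLAIN SELECT * FROM posts WHERE author_id = 42\G" mydb

    # Log anything slower than one second so the offenders show up:
    mysql -e "SET GLOBAL slow_query_log = 'ON'; SET GLOBAL long_query_time = 1;"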
And if you happen to benchmark a completely slow-as-molasses site, you WILL know there's room for optimization. I once benchmarked a seemingly simple site running on an 8-core server with 16 GB RAM and all that jazz. ab, siege and JMeter all gave me five requests per second! At that point I knew something was very, very badly wrong. Ironically enough, it was a totally misconfigured cache plugin accessing memcached that was at fault. After fixing that, the site was around 100 times faster. :-)
In my mind it's always about the volume of users and their expected/anticipated experience. Your use case will vary, but I'm sure that if Google started responding at the speed of, say, Expedia, people would be disappointed. So ultimately it's all about how your system reacts when faced with a certain volume of traffic, and we can't tell you that; you need to know it yourself.