I'm not going to go into specifics on the specs, since I know there's no real answer for this, but I've been doing load testing today with Apache's ab command.
It came out at around 70 requests per second (1,000 requests with 100 concurrent users) on a page that loads from 4 different DB tables and does some manipulation with the data, so it's a fairly heavy page.
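For reference, the test was run with something like the following (the hostname and path here are placeholders, not the real URL):

    # 1,000 requests total, 100 concurrent, against the heavy page
    ab -n 1000 -c 100 http://localhost/heavy-page.php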
The server isn't used for anything else for now, and the only load on it is me, since it's still in development. But the application will be used daily by many users.
But is this enough? Or should I even worry, as long as it's over some X requests per second? I'm thinking I shouldn't worry, but I'd like some tips on this.
70 requests per second works out to 252,000 page renders per hour.
If you assume that the average browsing session for your site is 10 pages deep, then you can support roughly 25,000 uniques per hour.
You should probably check these numbers against your expected visitor count, which should be available from the folks on the business side.
Many of the sites I work on see about 50% of their daily traffic in a roughly 3-hour peak period each day. If this is the case with your site (it depends on the kind of content you provide and the audience), then 25,000 uniques per hour over a 3-hour peak, doubled to cover the other half of the day's traffic, puts you at a daily unique visit count of around 150,000.
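Spelled out, the arithmetic behind those estimates looks roughly like this; the session depth, peak share, and peak length are the assumptions stated above, not measurements:

    <?php
    // Rough capacity arithmetic, using the assumptions above.
    $requests_per_second = 70;
    $pages_per_session   = 10;   // assumed average session depth
    $peak_share          = 0.5;  // assumed share of daily traffic in the peak
    $peak_hours          = 3;    // assumed length of the daily peak

    $pages_per_hour    = $requests_per_second * 3600;          // 252,000
    $sessions_per_hour = $pages_per_hour / $pages_per_session; // ~25,000
    $daily_uniques     = $sessions_per_hour * $peak_hours / $peak_share; // ~150,000

    echo "$pages_per_hour pages/hour, $sessions_per_hour sessions/hour, $daily_uniques uniques/day\n";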
These are pretty good numbers; I think you should be fine. It's wise to look into opcode caching and database tuning now, but remember: premature optimization is the root of all evil. Monitor the site, look for hotspots, and wait for traffic to grow before you go through an expensive optimization effort for a problem you may not have.
I have used two tools to watch the performance of my Apache servers in the past.
One is munin, which graphs all sorts of things (the number of Apache instances, the number of connections, available memory, processor usage, and so on) and helps me determine when I'm approaching a danger zone, and why.
The second one is simply the Apache server-status page (http://your_server/server-status?refresh=10), which lets me see the state of each connection, along with how many free connections are available at any given moment.
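If server-status isn't already switched on, it's provided by mod_status, and the configuration looks roughly like this (the access-control syntax shown is the Apache 2.2 style; adjust it to your version). Munin's Apache plugins read the same page, so one bit of config serves both tools:

    # Needs mod_status loaded; ExtendedStatus gives the per-connection detail
    ExtendedStatus On
    <Location /server-status>
        SetHandler server-status
        Order deny,allow
        Deny from all
        Allow from 127.0.0.1   # widen this to wherever you monitor from
    </Location>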
I'd suggest you worry only if you think your app will be very busy when it goes live. Is the page in question likely to be hit that hard? Harder? Less? If you have no idea, I suspect it's unlikely to be a problem early on. If it's your slowest page, you'll at least know one place to look if you have to optimize the system later.
There are also lots of things you can do to tune most web servers and database engines to squeeze more performance out: worker and connection limits, keep-alive settings, query indexes, and caching are the usual places to start.
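On a stock prefork Apache, for instance, the first knobs to look at are usually something like these (the values are placeholders, not recommendations; size MaxClients to your RAM divided by the per-process memory footprint):

    # Example prefork settings - illustrative values only
    StartServers          5
    MaxClients          150
    MaxRequestsPerChild 1000
    KeepAlive           On
    KeepAliveTimeout    3

On the database side, MySQL's slow query log is the quickest way to see which of your queries is actually costing you the most.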
You state in a comment that your server can handle 2,900 requests per second on an empty page. That indicates pretty strongly that it's not the web server itself; it's the processing.
If you're using PHP, consider an opcode cache like APC. If the DB is the bottleneck, memcached will help you as well.
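As a sketch of what that buys you, something like this caches the result of the expensive multi-table work in APC so most requests skip the database entirely (the key name, TTL, and helper functions are made up for illustration):

    <?php
    // Hypothetical sketch: cache the expensive query result in APC.
    // Key name, TTL and helper functions are placeholders.
    $key  = 'heavy_page_data';
    $data = apc_fetch($key, $hit);

    if (!$hit) {
        $data = load_from_four_tables(); // your existing multi-table queries
        apc_store($key, $data, 60);      // cache for 60 seconds
    }

    render_page($data);

memcached works much the same way via Memcache::get / Memcache::set, with the added benefit that the cache is shared across several web servers once you scale out.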
Once you put your site live you could also look at mod_top, which will give you a real-time view of the current load on Apache. I've not installed it myself, but it certainly seems to have more information and a better breakdown of load than the standard Apache server-status page.