I currently have an Apache2 server running with mpm-prefork and mod_php on an OpenVZ VPS with 512M real / 1024M burstable RAM (no swap). After running some tests, I found that the maximum size an Apache process reaches is about 23M, so I've set MaxClients to 25 (23M x 25 = 575 MB, which is fine for me). I then decided to run some load tests on my server, and the results left me puzzled.
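For reference, the relevant part of my prefork configuration looks roughly like this (only MaxClients is the value I actually tuned; the other directives are just typical defaults shown for context):

    <IfModule mpm_prefork_module>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients           25
        MaxRequestsPerChild   0
    </IfModule>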
I'm using ab on my desktop machine to request the main page of a WordPress blog. When I run ab with 24 concurrent connections, everything seems fine: CPU usage goes up, free RAM goes down, and the response time is about 2-3s per request.
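The command I'm running is something along these lines (the URL is just a placeholder for my blog's front page, and the total request count is only an example):

    ab -n 500 -c 24 http://myblog.example.com/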
But if I run ab with 25 concurrent connections (my server's limit), Apache just hangs after a couple of seconds. It starts processing the requests, then stops responding, the CPU goes back to 100% idle, and ab times out. The Apache log says it has reached MaxClients.
When this happens, Apache stays locked up with 25 running processes (they're all in the "W" state if I check the server status), and only once the TimeOut setting kicks in do the processes start to die and the server start responding again (in my case it's set to 45).
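To see the "W" workers I'm just looking at mod_status output, roughly like this (assuming mod_status is enabled and server-status is reachable from localhost; the ?auto form prints a plain-text scoreboard):

    curl http://localhost/server-status?auto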
My question: is that expected behaviour? Why does Apache just die when it reaches MaxClients? If it works with 24 connections, shouldn't it also work with 25, just taking more time to respond to each request and queueing up the rest?
It seems kind of strange to me that any kid running ab could single-handedly take down a webserver just by setting the number of concurrent connections to the server's MaxClients.