I want to stress test my webserver. Here is how I want to do it.
I have 'u' different URLs on my webserver, and I need to hit these URLs in parallel at a rate of 'n' hits per second, per URL.
What would be the best tool to do this?
Your title suggests that you have JMeter, which is how I would do it. Create a test plan that puts all the URLs in a Random Controller, weighted to match your usage statistics (if you have any; otherwise stick with an even distribution). Then set up each user to hit a page every n seconds (depending on how much text is on each page, users will stay there between 4 and 40 seconds) and configure the plan to ramp up to a very large number of users. The only problem with JMeter is that it will sometimes overwhelm its own host before it overwhelms the webserver. Also, the machine you are testing and the machine you are testing with should be on the same LAN segment, to rule out the possibility that bandwidth is affecting your results.
Try the httperf utility. A single invocation can, for example, ask foo.bar 100 times for URI /test at a fixed rate of 10 connections per second. If you want to ask for different URIs, run many httperf instances in parallel with different --uri parameters.
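A sketch of the commands matching that description (foo.bar and /test are placeholder names, and urls.txt is an assumed file listing one URI path per line):

```shell
# Hit foo.bar 100 times for /test at a fixed rate of 10 connections/second.
httperf --server foo.bar --uri /test --num-conns 100 --rate 10

# Run one httperf instance per URI in parallel, one background job each.
while read -r uri; do
  httperf --server foo.bar --uri "$uri" --num-conns 100 --rate 10 &
done < urls.txt
wait  # block until every background httperf has finished
```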
Apache JMeter is a free and open source tool and one of the most powerful performance testing applications. It has a GUI, supports multiple protocols, and provides record-and-replay functionality, assertions, and reporting tools out of the box. It can be extended with custom samplers and plugins; there are lots of existing plugins (see the list at http://jmeter-plugins.org/wiki/Start/) and a big community.

To specify an exact load in "requests per second" you can use the Constant Throughput Timer.
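Note that the Constant Throughput Timer takes its target in samples per minute, so a desired rate of n hits per second per URL has to be multiplied by 60. A minimal sketch of that conversion (the function name is my own):

```python
def throughput_per_minute(hits_per_second: float) -> float:
    """Convert a per-URL hit rate in hits/second to the
    samples-per-minute value the Constant Throughput Timer expects."""
    return hits_per_second * 60.0

# Example: n = 10 hits/second per URL -> 600 samples/minute.
print(throughput_per_minute(10))  # 600.0
```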