I can't find anything that specifically explains what number of concurrent connections is reasonable, nor can I find any papers or research on the topic.
The boss says "The server needs to handle 50,000 concurrent users - so measure with 50,000 concurrent connections."
Then I see in at least one popular blog the claim that 50 concurrent connections is equivalent to being Slashdotted or Farked.
What are the numbers? What is reasonable? What isn't? How do concurrent connections translate to the number of users online at any given time?
What is reasonable depends on what kind of traffic your server is getting. A typical web server that handles about 1 million page requests per day (and that's quite a busy little server) will rarely ever need more than 50 concurrent connections, unless your page processing is really heavy.
If your average page processing time (i.e. the time from receiving the HTTP request until that request is completely dealt with) is around 300ms (which is quite a lot and will probably get you complaints from your users), then each connection can serve roughly 3 requests per second, so with 50 concurrent connections this server can handle approx. 3 x 3600 x 50 = 540k requests in one hour. However, you need to consider that each image is a separate request, and so are JavaScript and CSS files. On the other hand, a lot of these are cached by browsers, so they only get hit once.
Still, that should be good enough for well over 100k pages per hour. Is that enough for you? If not, up the figure.
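To make that arithmetic explicit, here is a quick back-of-envelope sketch (the 300 ms and 50-connection figures are the assumptions from above; the assets-per-page figure is purely illustrative):

```python
# Rough capacity estimate: requests per hour for a given number of
# concurrent connections and an average per-request processing time.

avg_processing_time_s = 0.3      # 300 ms from receiving the request to finishing it
concurrent_connections = 50

# Each connection can serve roughly 1 / 0.3 ≈ 3.3 requests per second
# (the figure in the text rounds this down to 3).
requests_per_sec_per_conn = 1 / avg_processing_time_s

hourly_requests = requests_per_sec_per_conn * concurrent_connections * 3600
print(f"~{hourly_requests:,.0f} requests/hour")       # ~600,000

# If each page pulls in, say, 4 uncached assets (images, JS, CSS),
# then a page costs 5 requests in total.
assets_per_page = 4                                    # illustrative only
pages_per_hour = hourly_requests / (1 + assets_per_page)
print(f"~{pages_per_hour:,.0f} pages/hour")            # ~120,000
```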
You should try siege.
Siege simulates users rather than raw connections, which is more appropriate for your purpose.
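For example, something like `siege -c 50 -t 1M http://www.example.com/` should run 50 simulated users against that URL for one minute (the URL is a placeholder; check siege's man page, since flags can vary between versions).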