The ISP hosting a mostly static web site complains when certain users start to hammer the site at around 30 hits/second; the machine apparently slows to a crawl. The files range from simple pages with a few graphics to fairly large downloads. Other sites are hosted on the same server, so this is not good news.
Should Apache be able to take this load? Are there tips that the ISP can use to tune the server for this? Is there anything to tweak on the pages themselves?
Wow... 30 static requests per second should be nothing for a well-tuned Apache; something is going pretty badly wrong there. Either the machine is already running at capacity, or it's mistuned. My three primary tweaks for Apache are along the lines of the sketch below:
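As an illustration only, assuming a prefork Apache 2.x serving static files out of /var/www/html (a hypothetical path; every value below is a placeholder to be sized against the actual machine, not a recipe):

    # Short keepalives let browsers reuse connections without
    # letting idle clients pin worker processes for long.
    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout 2

    # AllowOverride None avoids a .htaccess stat() in every
    # directory on every single request.
    <Directory "/var/www/html">
        Options FollowSymLinks
        AllowOverride None
    </Directory>

    # Don't do a reverse DNS lookup per request just to pretty
    # up the access log.
    HostnameLookups Off

    # Prefork limits: MaxClients times the per-process memory
    # footprint must fit in physical RAM, or the box swaps and
    # crawls exactly as described above.
    <IfModule mpm_prefork_module>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients          150
        MaxRequestsPerChild 4000
    </IfModule>

The common theme is cutting per-request overhead (filesystem walks, DNS lookups) and keeping the total process count inside physical RAM.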
I've got more general "make your webserver handle more capacity" tips in this wiki article from my work, and there are plenty more "tweaking Apache" tips in this devside article.
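If you want to confirm just how far off the pace the box is, ApacheBench (the ab tool shipped with Apache) can replay the complained-about load; the URL below is a stand-in for one of the site's real pages, and 30 concurrent clients roughly mirrors the 30 hits/second figure:

    # 3,000 requests, 30 at a time, against one representative page.
    ab -n 3000 -c 30 http://www.example.com/index.html

    # The same run with HTTP keepalive, to see how much connection
    # setup is costing.
    ab -k -n 3000 -c 30 http://www.example.com/index.html

On a healthy server this should come back with a requests-per-second figure well above 30 for static content; if it doesn't, the tuning above (or the hardware) is the place to look.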
Honestly, though, if your ISP isn't already on top of this sort of thing, it's time to find a new webhost. Customers shouldn't have to come asking serverfault for tips to pass on to their hosting company.