I have a web server image that I am responsible for building across multiple servers. I have a list of about 50 URLs that I am supposed to visit and confirm that the correct content is showing up. Which automated tools exist to do this easily (without writing a bunch of curl requests and regexes in a script file)?
I have my doubts that you'll find anything easier than curl (or wget) and a few lines of $SCRIPTING_LANGUAGE_OF_CHOICE. Seriously, it's about five minutes of work, even in Ruby (slightly complicated by the fact that Net::HTTP is furgly).

While I agree with the sentiment that curl/wget and scripting can get the job done, if you are really looking for another way, consider penetration-testing tools. Especially if you need to interact with the site, for example to log in, these tools will let you automate your testing. Of course, you will still need to do a little work to configure/script the tool to do what you need.
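For reference, a minimal sketch of the Ruby/Net::HTTP approach mentioned above. The URL and the expected pattern are placeholders, not anything from the original question; substitute your own list of ~50 entries:

```ruby
require 'net/http'
require 'uri'

# Hypothetical URL => expected-content map; replace with your own URLs.
CHECKS = {
  'http://example.com/' => /Example Domain/
}

# True if the fetched body matches the expected pattern.
def content_ok?(body, pattern)
  body.match?(pattern)
end

# Fetch each URL and print OK/FAIL next to it.
def run_checks(checks)
  checks.each do |url, pattern|
    body = Net::HTTP.get(URI(url))
    puts "#{content_ok?(body, pattern) ? 'OK  ' : 'FAIL'} #{url}"
  end
end

# Usage: run_checks(CHECKS)
```

Keeping the match logic in its own method makes it easy to swap the regex check for something stricter (an exact string, a checksum) without touching the fetching loop.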
These people offer a service at http://siteuptime.com. I don't think you need them, but it could be useful.
You could write a PHP script (or one in whatever language your websites run on) to check the status of the services.
To check status: fetch each page with wget, then grep the output for the expected content.
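A minimal shell sketch of that wget-and-grep idea. The file urls.txt is a hypothetical input (one "URL expected-string" pair per line) that you would create yourself:

```shell
#!/bin/sh
# check_content: reads a page body on stdin and succeeds (exit 0)
# if it contains the string given as $1.
check_content() {
  grep -q -- "$1"
}

# run_checks: reads "URL expected-string" pairs from the file named in $1,
# fetches each URL with wget, and reports OK/FAIL per line.
run_checks() {
  while read -r url expected; do
    if wget -q -O - "$url" | check_content "$expected"; then
      echo "OK   $url"
    else
      echo "FAIL $url"
    fi
  done < "$1"
}

# Usage (only runs if the hypothetical urls.txt exists):
[ -f urls.txt ] && run_checks urls.txt || true
```

With 50 URLs this stays a one-file script: the loop exits nonzero checks into visible FAIL lines, which is easy to grep again or feed into a cron-mail alert.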