Our company manages over one hundred servers, and we would like to "ask" these servers for basic usage info once or twice a day over HTTP. The usage info can easily be gathered by a Perl CGI script, and an HTTP interface would ease writing and testing the scripts. It seems overkill to run Apache, or even nginx+fcgiwrap, to serve one or two requests per day. We were thinking of using openbsd-inetd (which is already installed on all the servers) to launch a small web server that could pass the request to the Perl CGI script. What are good lightweight alternatives for doing this?
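A minimal sketch of what we have in mind (the port, user, and script path are placeholders): inetd spawns one process per connection, and the script speaks just enough HTTP/1.0 on stdin/stdout.

    # /etc/inetd.conf: one process per connection; the script talks HTTP on stdin/stdout.
    # If your inetd insists on a service name here, add one to /etc/services instead of 8888.
    8888 stream tcp nowait www-data /usr/local/bin/usage-http.pl usage-http.pl

    #!/usr/bin/env perl
    # usage-http.pl: hypothetical inetd-launched responder; get_usage() is a
    # stub standing in for the real CGI logic.
    use strict;
    use warnings;

    my $request = <STDIN>;                  # e.g. "GET /usage HTTP/1.0"
    while (defined(my $line = <STDIN>)) {   # discard the request headers
        last if $line =~ /^\r?\n$/;
    }

    my $body = get_usage();
    print "HTTP/1.0 200 OK\r\n",
          "Content-Type: text/plain\r\n",
          "Content-Length: ", length($body), "\r\n",
          "Connection: close\r\n\r\n",
          $body;

    sub get_usage {
        return scalar `uptime`;             # placeholder for the real usage probe
    }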
Have you considered a central web server that executes custom scripts on the queried servers via SSH?
In many environments SSH is already used for remote management, and SSH can automatically execute a specific local command for connections authenticated with a given key (the command= option in authorized_keys); see the sketch below.
IMHO it is an option worth seriously considering if SSH is deployed anyway.
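As a sketch, the forced command lives in ~/.ssh/authorized_keys on each queried server (the user, key, and script path below are hypothetical):

    command="/usr/local/bin/report-usage.pl",no-pty,no-port-forwarding,no-agent-forwarding,no-X11-forwarding ssh-ed25519 AAAA... monitor@central

The central poller then only needs to run ssh monitor@server1 with that key and read stdout; sshd ignores whatever command the client requests and runs the forced one, so the key can do nothing else.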
If you are writing a Perl script anyway, have a look at Mojolicious or Dancer (or the heavier Catalyst). They ship with a built-in HTTP server that can listen on a port and serve the few requests you need, so you will not be forced to put a web server in front of your script.
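For example, a minimal Mojolicious::Lite sketch (the /usage route and the uptime payload are just placeholders):

    #!/usr/bin/env perl
    use Mojolicious::Lite;
    use Sys::Hostname;

    # Hypothetical endpoint: report host name and raw `uptime` output as JSON.
    get '/usage' => sub {
        my $c = shift;
        chomp(my $uptime = `uptime`);
        $c->render(json => { host => hostname(), uptime => $uptime });
    };

    app->start;

Start it with "perl usage.pl daemon -l http://*:8080" and it serves HTTP directly, with no front-end web server needed.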
I used a script to collect the data, zip it, and send the archive via scp (public-key SSH auth) to a log server.
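Roughly like this (the host, paths, and what counts as "usage info" are placeholders), run from cron on each server:

    #!/usr/bin/env perl
    # Hypothetical collector: gather usage info, compress it, push it to a log host.
    use strict;
    use warnings;
    use Sys::Hostname;
    use POSIX qw(strftime);

    my $stamp   = strftime('%Y%m%d-%H%M', localtime);
    my $archive = '/tmp/usage-' . hostname() . "-$stamp.txt.gz";

    # Collect the usage info; `uptime` is a stand-in for the real probe.
    open my $out, '|-', "gzip > $archive" or die "gzip: $!";
    print $out scalar `uptime`;
    close $out or die "gzip failed: $?";

    # Push with public-key auth (so no password prompt), then clean up.
    system('scp', '-q', $archive, 'logger@loghost:/var/log/usage/') == 0
        or die "scp failed: $?";
    unlink $archive;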