I have a website hosted on two servers. The website is served over SSL.
I'd like to monitor the site per server (load the main page and look for a certain string) using a local Opsview instance. Right now that means editing Opsview's hosts file, adding an entry like "10.10.10.33 domain.com", and changing it each time to point at the right server. I obviously can't script this, as the results are very likely to be skewed during the check.
Is there some sort of HTTP client for Linux that can take an IP address and a domain name and combine them for a single request? I tried both curl --proxy and wget --header to no avail.
Many SSL HTTP servers don't care what Host: header is supplied, so simply requesting https://$ipaddress/foo should work, as long as you can convince the client (e.g. wget) to ignore the certificate CN mismatch.
Otherwise try something like:
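The command that followed this answer appears to have been lost; a plausible reconstruction, assuming wget as the client (the hostname `domain.com` and IP `10.10.10.33` are the hypothetical values from the question):

```shell
# Fetch the page from a specific backend by IP, sending the real
# Host header; --no-check-certificate suppresses the CN mismatch
# error caused by connecting via the IP address.
wget --no-check-certificate \
     --header="Host: domain.com" \
     -qO- https://10.10.10.33/ | grep "certain string"
```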
You don't need the hostname: you can connect using the IP alone and pass the domain with curl. The `-k` flag makes cURL ignore certificate errors, the `-H` flag sets the Host header, and curl downloads the result. You can pipe it through `awk`, `grep`, or any other tool you have.
Have you tried using curl to connect to the IP address and overriding the host?
I eventually found a solution that doesn't require suppressing the certificate warnings.
This requires the latest cURL, which is 7.37.1 at the time of writing.
The right syntax is:
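The command itself is missing from the answer; the cURL feature that matches this description (connecting to a chosen IP while keeping certificate validation intact) is the `--resolve` option, so the syntax was presumably something like:

```shell
# Pin domain.com:443 to a specific server IP. curl connects to
# 10.10.10.33 but performs SNI and certificate validation against
# domain.com, so no -k is needed.
curl --resolve domain.com:443:10.10.10.33 https://domain.com/
```

Swapping in the other server's IP lets you check each backend in turn without touching the hosts file.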