My application sits behind a load balancer, and every once in a while I like to do a status check to get an idea of the time each machine takes to return an index.html document.
The script looks like this:
for host in 192.168.0.7 192.168.0.8 192.168.0.9; do
result=$( ( time wget -q --header="Host: domain.tomonitor.com" http://$host/ ) 2>&1 | grep real | awk '{print $2}' )
date=$(date)
echo "$date, $host, $result"
done
Since the application thinks it's on domain.tomonitor.com, I set that manually in the wget request header. The script greps for the "real" time and awks out the time alone, dumping it into the $result variable. Empirically, it seems to work pretty well as a basic manual check -- responses typically take 2-3 seconds across my various servers, unless there are some unbalanced connections going on. I run it directly from my Mac OS X laptop against our private network.
The other day I wondered if I could log the results over time with a cron job. I was amazed to find it reported sub-second responses, for example .003 seconds. I also tried displaying the script's output on my desktop with an OS X widget called Geektool and saw similar sub-second times reported.
I suspect the difference is due to some user error -- some reason why the time wget command I'm running won't work. Can anyone tell me why the time it takes to run this script differs so much between user (me running it by hand) and system (cron job or Geektool), and how I might correct the discrepancy?
You don't show your shebang line, but based on what you're grepping for, I'd say you're running this under Bash. If you don't have a shebang line, you should add one. The Bourne shell doesn't have a built-in time command, so it will use /usr/bin/time, which has a different output format than Bash's built-in time.
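For example, if the script doesn't already start with an interpreter directive, pinning it to Bash is as simple as making this the first line (the comment is mine, summarizing the reasoning above):
#!/bin/bash
# Without an explicit shebang, cron may run the script under /bin/sh, where
# `time` can resolve to /usr/bin/time and print a different report format.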
Since you're using Bash, you can set the output format of the time command using the TIMEFORMAT variable, so you don't need grep and awk. I would also use curly braces to avoid any overhead that creating a sub-shell might add; see the sketch below.
I'm not familiar with Geektool, so I don't know how it affects your results. However, the changes sketched below may make the script work more consistently between environments. Have you considered whether the connectivity is simply that much better for the server?
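I can't see your whole script, but a minimal sketch of the loop with both changes applied might look like this ('%R', which prints only the elapsed real time in seconds, is my choice of format; the hosts and Host: header are the ones from your question):
#!/bin/bash
# Sketch: TIMEFORMAT makes Bash's built-in `time` print only the elapsed
# real time (%R), so grep/awk are no longer needed; the curly braces group
# the command without forking an extra sub-shell.
TIMEFORMAT='%R'

for host in 192.168.0.7 192.168.0.8 192.168.0.9; do
    # `time` writes its report to stderr, so redirect the group's stderr
    # into the command substitution to capture it in $result.
    result=$( { time wget -q --header="Host: domain.tomonitor.com" "http://$host/"; } 2>&1 )
    echo "$(date), $host, $result"
done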
Another thing to check is whether you're getting the expected response to your wget command. Times that small sometimes indicate that you're getting an error. Running the script from cron may mail you the error, but you can log it by making a change along the lines sketched below, which will put the output and error messages from wget into a file called "/tmp/wget.PID.out", where "PID" is a numeric process ID. The output from time will still go to the variable.
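The exact change depends on your final script; a minimal sketch, building on the TIMEFORMAT loop above, might be the following (I've dropped -q so wget actually emits messages worth logging, and used $$, the shell's process ID, for the file name):
# wget's stdout and stderr go to /tmp/wget.$$.out, while `time`'s report
# (written to the brace group's stderr) is still captured in $result.
result=$( { time wget --header="Host: domain.tomonitor.com" "http://$host/" \
                > "/tmp/wget.$$.out" 2>&1; } 2>&1 )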