I need a calculation to work out the downtime percentage of a server.
I am making a script that runs via cron every minute to check the uptime of a remote server.
The two values I have to play with are number of checks run and times the checks failed (outages).
Is this a plausible way of calculating it? I think it must be, but I can't be sure, as my maths skills are slipping away from me with age!
Here's a simple spreadsheet layout to calculate and keep for historical purposes. You can set goals and apply conditional formatting to your heart's content.
Errrm, 100 * failures / (failures + successes), or even simpler, 100 * failures / total_checks?

Here's a modification of Ben's answer that doesn't need an ugly "Minutes in month" column.
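As a sanity check of the formula above, here is a minimal Python sketch. The function name and variables are illustrative, not from the question; it simply applies 100 * failures / total_checks, where total_checks is the number of cron runs (one per minute).

```python
def downtime_percentage(failures, total_checks):
    """Percentage of checks that failed, i.e. the downtime percentage."""
    if total_checks == 0:
        return 0.0  # no checks have run yet; avoid division by zero
    return 100.0 * failures / total_checks

# Example: a check every minute for a 30-day month gives
# 30 * 24 * 60 = 43200 checks; suppose 12 of them failed.
print(downtime_percentage(12, 43200))
```

Uptime percentage is then just 100 minus this value, which is why tracking only the two counters (checks run and checks failed) is enough.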