Take this (ServerFault) page, for instance. It has about 20 elements. When the last of these has loaded, the page is deemed "loaded", but not before. This is certainly the protocol used by our testing service (which is among the small group of well-known vendors offering that sort of service). This method has the advantage of a clear, definite endpoint, so it's easy to apply with correspondingly good reliability. I believe it's also the metric used by the popular Firefox plugin YSlow.
For my employer's website, the last items to load are nearly always tracking code, tracking pixels, and the like, so from the user's point of view the page was "loaded" well before it had actually finished loading by our testing service's criterion (roughly 15-20% earlier, at a rough estimate).
I'm sure I'm not the first person to consider this, nor the first to wonder whether it encourages micro-optimization while ignoring overall, user-perceived performance. So my question is: are there other, more practical (yet still reasonably precise) measures of page loading time?
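For concreteness, here is the kind of gap I mean, as a rough sketch using the browser's Navigation Timing API, run from inside the page itself. I haven't validated this against our testing service's numbers, so treat it as illustrative only:

```typescript
// Sketch: compare DOM-ready (closer to what the user perceives as "loaded")
// against the load event (which fires only after the last element, e.g. a
// tracking pixel, has finished). Uses the legacy performance.timing interface.
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd is populated before we read it.
  setTimeout(() => {
    const t = performance.timing;
    const domReady = t.domContentLoadedEventEnd - t.navigationStart;
    const fullLoad = t.loadEventEnd - t.navigationStart;
    console.log(`DOM ready (~ perceived load): ${domReady} ms`);
    console.log(`Full load (last element):     ${fullLoad} ms`);
    console.log(`Gap the user never sees:      ${fullLoad - domReady} ms`);
  }, 0);
});
```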
There are two good tools I know of for measuring website performance: YSlow by Yahoo and Page Speed by Google. These tools will give you a good overview of where your page is spending its time, and some hints on how to do better.
There are also some good blogs about page performance:
High Scalability
High Performance Web Sites
In these blogs you might find some new perspectives and ideas about website performance.
EDIT: Here is an article which discusses performance.
EDIT2:
It now seems to be even more important, as Google counts site speed as a ranking factor: http://searchengineland.com/google-now-counts-site-speed-as-ranking-factor-39708
EDIT3:
Here is a page with numbers from Google, Bing, Yahoo, Mozilla, and some others correlating speed with business metrics.
Since it is 'perceived load time' rather than 'actual load time' that is arguably the most important metric, it is difficult to measure precisely and consistently, because it all depends on perception, which varies with the user and with the nature of the page in question.
For example, I'll frequently open an informational page and be happily reading the content long before the page has fully loaded. Equally, on the many websites where my browser stores my username and password, the page frequently appears to have loaded several seconds before those stored credentials are automatically populated; clearly the page wasn't fully loaded when it appeared to be.
My point is that the moment at which I can get on with what I want to do is partly determined by the nature of the page in question; I don't see how you can automatically determine the point at which a page can be considered usable.
If you need a consistently measurable metric, stick with what you have. If you want a more accurate one (the point at which a page can be considered usable), it will probably require human judgement.
Sounds to me like you need a script that downloads the web page but skips anything linked to something outside the base URL. That would give you a load time for the page that actually means something with regard to optimisation. I don't know of such a script offhand, but a sketch of the idea is below.
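To be clear about the assumptions: this is a minimal Node.js sketch, not a tested tool. It assumes Node 18+ (for the built-in fetch), extracts resource URLs with a naive regex rather than a real HTML parser, and simply times the page plus its same-origin resources.

```typescript
// Minimal sketch: time a page plus only its same-origin resources.
// Assumes Node 18+ (built-in fetch). The regex extraction is naive; a real
// HTML parser would also handle srcset, inline CSS url(...), etc.

async function timeSameOriginLoad(pageUrl: string): Promise<void> {
  const base = new URL(pageUrl);
  const start = Date.now();

  const html = await (await fetch(pageUrl)).text();

  // Grab src attributes (img/script/iframe) and <link> hrefs (stylesheets, icons).
  const tagRe =
    /<(?:img|script|iframe)[^>]*\ssrc=["']([^"']+)["']|<link[^>]*\shref=["']([^"']+)["']/gi;
  const urls = [...html.matchAll(tagRe)]
    .map((m) => m[1] ?? m[2])
    .filter((u): u is string => !!u)
    .map((u) => new URL(u, base))              // resolve relative URLs
    .filter((u) => u.origin === base.origin);  // skip third-party hosts

  // Fetch the same-origin resources in parallel, ignoring individual failures.
  await Promise.all(
    urls.map((u) => fetch(u.href).then((r) => r.arrayBuffer()).catch(() => null))
  );

  console.log(
    `${pageUrl}: ${urls.length} same-origin resources, ${Date.now() - start} ms total`
  );
}

timeSameOriginLoad("https://example.com/").catch(console.error);
```

Comparing this number against your testing service's full-load number should give a rough idea of how much of the load time is third-party.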
It's not a black and white question, IMO. For you, on a fast PC with a broadband connection and a modern web browser, waiting for some secondary element to load may not be a big deal in terms of "perceived page load time". For you the difference is marginal.
But for the guy at a corporate branch office, on a bonded T1 shared by 100 people, running Internet Explorer through a centralized proxy that passes every page load through a security appliance (McAfee, WebRoot, Finjan, etc.), things are different. The gap between "the page looks loaded" and when it has actually loaded could be several seconds, and that is a big deal. Some security appliances won't deliver the page at all until everything has finished loading.
You should be demanding that your developers and vendors deliver a quality service. If it's taking 5 seconds to load a web advertisement, there is no ad in front of your visitor's eyeballs for those 5 seconds.
There are a lot of tools available to you. When you visit the website, they can show the time taken to download each element, which helps you figure out where the slowness comes from. Try a network monitor.
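If you want the same per-element numbers programmatically, the browser's standard Resource Timing API exposes them. A small sketch, assuming you can run a script inside the loaded page (e.g. from the developer console):

```typescript
// Sketch: list every downloaded element with its download time, slowest
// first, using the standard Resource Timing API from inside the page.
const resources = performance.getEntriesByType(
  "resource"
) as PerformanceResourceTiming[];

resources
  .sort((a, b) => b.duration - a.duration)
  .forEach((r) => {
    // initiatorType says what triggered the fetch: img, script, css, ...
    console.log(
      `${r.duration.toFixed(0).padStart(6)} ms  ${r.initiatorType.padEnd(8)} ${r.name}`
    );
  });
```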