What effect can a latency of 100 ms have on a page load time? If a 100 KB file is served from two locations with latencies of 100 ms and 200 ms respectively, will the difference in load time be just 100 ms, or will the second request take double the time to load (assuming that the networks are equally optimized and all other network parameters are the same)?
Assuming you're talking about 100 ms/200 ms of latency between a client and the web server, it depends on the page.
If it's a single big file, the difference in load times will be 100 ms.
If it's a modern web page that requires 10 to 40 individual requests to the server in order to load, the difference will be considerably higher. Exactly how much depends on the capabilities of the browser and server (HTTP pipelining will help, assuming HTTP) and on how the browser serializes what gets loaded when. The higher-latency page could take as much as twice as long to load.
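To make that concrete, here is a minimal back-of-the-envelope sketch, not a measurement tool. It assumes every request costs roughly one round trip, ignores TCP/TLS handshakes and slow start, and the function name, connection-pool size, and bandwidth figure are illustrative assumptions rather than details from the question:

```python
import math

def estimated_load_time(rtt_s, num_requests, parallel_connections=6,
                        bytes_total=100_000, bandwidth_bps=1_000_000):
    # Requests are served in "waves" limited by the connection pool,
    # so each wave costs roughly one round trip of pure latency.
    waves = math.ceil(num_requests / parallel_connections)
    latency_cost = waves * rtt_s
    # Transfer time for the total payload at the assumed bandwidth.
    transfer_cost = (bytes_total * 8) / bandwidth_bps
    return latency_cost + transfer_cost

for rtt in (0.100, 0.200):
    for requests in (1, 40):
        t = estimated_load_time(rtt, requests)
        print(f"RTT {rtt*1000:.0f} ms, {requests:2d} requests: ~{t:.2f} s")
```

With a single request the two totals differ by exactly one round trip (100 ms); with 40 requests the gap grows to several round trips, which is the effect described above.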
For a single static file like that, the difference will be 100 ms.
A page that loads multiple CSS files, one or more JavaScript files, and then dynamically fetches content based on what that JavaScript does will load a lot slower, because each dependent request adds another full round trip.
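To illustrate why dependent requests hurt so much more, here is a small sketch of a dependency chain (the HTML, then the JavaScript it references, then the content that JavaScript fetches); the stage count and function name are assumptions for illustration, and real pages overlap some of these stages:

```python
def chained_load_time(rtt_s, stages):
    # Each stage cannot start until the previous response has arrived,
    # so every stage pays a full round trip that cannot be overlapped.
    return stages * rtt_s

for rtt in (0.100, 0.200):
    # 3 stages: fetch the HTML, fetch the JavaScript it references,
    # then fetch the content the JavaScript requests dynamically.
    print(f"RTT {rtt*1000:.0f} ms, 3 chained stages: "
          f"~{chained_load_time(rtt, 3)*1000:.0f} ms of pure latency")
```

The extra 100 ms of latency is paid once per chained stage, so the gap between the two servers grows with every dependency in the page.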