We have just started using more JavaScript in the web interface of an internal application (PHP, if it matters). Now that the changes are in place, it is becoming very obvious that the more JavaScript on the page, the slower the page loads through Squid.
Any suggestions on why this is happening? I don't want the question to be too vague, but I don't want to suggest something when I don't know what I'm looking for.
One thing that occurred to me: what if the pages without JavaScript aren't getting cached, and our Squid server is secretly slow? How do I test this?
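One way to test this from a client is to request a page through the proxy with `curl -sI -x proxy:3128 http://app/page.php` and look at the response headers Squid adds. The hostnames below are hypothetical, and the captured headers are a made-up sample standing in for real output, just to show what a cache hit looks like:

```shell
# Hypothetical capture: curl -sI -x proxy.internal:3128 http://app.internal/page.php > headers.txt
# Sample response headers (illustrative) standing in for the real capture:
cat > headers.txt <<'EOF'
HTTP/1.1 200 OK
Content-Type: text/html
X-Cache: HIT from proxy.internal
Via: 1.1 proxy.internal (squid)
EOF

# "HIT" in X-Cache means Squid answered from its cache;
# "MISS" means it had to fetch the page from the origin server.
grep -i '^X-Cache:' headers.txt
```

If the HTML pages show `MISS` on every request while the static files show `HIT`, the pages without (or with) JavaScript aren't being cached at all.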
Please, enlighten me!
Update 1: All of the JavaScript is cached and being pulled from the proxy server. The largest chunk, data-wise (~60 KB), is the generated HTML, and that is a miss every time.
Update 2: There is no AJAX; the JavaScript is confined to a floating toolbar and handles some text pre-parsing for a search feature. It's simple rule-based logic, "if it has x many characters, look for a matching order number" kind of thing.
Upon closer inspection, all the cached JavaScript is checked to see whether it's the newest version before being sent on, triggering a TCP_REFRESH_HIT/304 each time.
I have a feeling this may be my bottleneck.
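If the origin isn't sending `Expires` or `Cache-Control: max-age` headers, Squid revalidates on every request, which produces exactly those TCP_REFRESH_HIT/304s. The cleanest fix is to have the server send those headers for static files; alternatively, a `refresh_pattern` rule in squid.conf tells Squid how long to treat them as fresh when the origin says nothing. A sketch with illustrative values (minutes):

```
# squid.conf sketch: when the origin sends no freshness headers, treat
# .js/.css as fresh for at least 1440 min (1 day), up to 10080 min (1 week),
# so Squid can serve them from cache without a per-request revalidation.
refresh_pattern -i \.(js|css)$ 1440 50% 10080
```

Note that `refresh_pattern` only applies to responses without explicit expiry information; it won't override headers the origin does send.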
Inspect your Squid access log and look for TCP_HIT entries. This will tell you which requests are being delivered from cache versus being forwarded on to the origin.
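In Squid's native log format the fourth field is the result code (e.g. `TCP_HIT/200`), so a quick summary of hits vs. misses is one awk pipeline away. The sample lines below are made up to stand in for `/var/log/squid/access.log`; point the pipeline at the real file on your proxy:

```shell
# Illustrative sample of Squid native-format access log lines:
cat > access.log <<'EOF'
1200000000.000    45 10.0.0.5 TCP_HIT/200 6000 GET http://app.internal/toolbar.js - NONE/- application/javascript
1200000001.000   310 10.0.0.5 TCP_MISS/200 61440 GET http://app.internal/page.php - DIRECT/10.0.0.9 text/html
1200000002.000    80 10.0.0.5 TCP_REFRESH_HIT/304 400 GET http://app.internal/toolbar.js - DIRECT/10.0.0.9 -
EOF

# Count each result code: field 4 is the result/status pair.
awk '{print $4}' access.log | sort | uniq -c | sort -rn
```

A log dominated by `TCP_MISS` or `TCP_REFRESH_HIT` entries confirms that the cache is doing little useful work for those URLs.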
From the client, you can't be absolutely sure whether you're connecting via Squid; you might have the X-Forwarded-For header present, though.
Also, try accessing the site directly, bypassing the proxy. Is it fast then, or is it just your browser itself being slow due to all the JavaScript?
What exactly do you mean by "started using more JavaScript"? If you're adding AJAX with numerous simultaneous calls/retrieves, then you might be hitting the limit on simultaneous client connections.