I am not a pro in web development, and the Apache server still remains a mystery to me. We've got a project which runs on LAMP, pretty much like all the commercial hosting plans.
I am confused about one problem: do modern browsers support loading images in parallel, or does this require some special feature or configuration on the server side? Can it be done with PHP code or with some server-side configuration? Is a content delivery network (CDN) needed for this?
My benchmark is the Flickr website. I am surprised to see how quickly all the image thumbnails load after a search, as if there were only one image to load.
Sorry, I can't show you any code... I'm completely lost on this :(
Do you mean multipart responses?
Those are not supported widely enough. See this question on SO.
However, I think Flickr uses plain Ajax to achieve this. They don't need to reload the page; they just fetch the images using JavaScript. All major JavaScript frameworks have a full suite of Ajax functions, for example jQuery (a rough sketch follows below).
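A minimal sketch of that idea, assuming a hypothetical `/search` endpoint that returns JSON with a `thumbnails` array of URLs (the endpoint name and response shape are made up for illustration, not what Flickr actually uses):

    // Fetch search results via Ajax and insert the thumbnails without reloading the page.
    // "/search" and the { thumbnails: [...] } response shape are hypothetical.
    $.getJSON('/search', { q: 'sunset' }, function (data) {
        $.each(data.thumbnails, function (i, url) {
            // Appending <img> tags lets the browser fetch the images in parallel on its own.
            $('#results').append($('<img>', { src: url, alt: 'thumbnail' }));
        });
    });
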
If you mean plain old parallel requesting of various images, the number of parallel requests is usually limited on both the client and the server end. A very popular trick is to spread resources across as many different hosts as possible, often called domain sharding (see the sketch below).
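A rough sketch of that trick, assuming you have set up aliases like img1.example.com through img4.example.com that all point at the same image directory (the hostnames are placeholders):

    // Spread image requests across several hostnames so the browser's
    // per-host connection limit applies to each alias separately.
    // img1..img4.example.com are placeholder aliases for the same server.
    function shardUrl(path) {
        var shards = ['img1.example.com', 'img2.example.com', 'img3.example.com', 'img4.example.com'];
        // A simple hash of the path keeps each image on a stable host (good for caching).
        var hash = 0;
        for (var i = 0; i < path.length; i++) {
            hash = (hash * 31 + path.charCodeAt(i)) % shards.length;
        }
        return 'http://' + shards[hash] + path;
    }

    // Usage: build the URL in JavaScript, or generate it the same way server-side in PHP.
    document.getElementById('thumb').src = shardUrl('/thumbs/12345.jpg');
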
Modern browsers will try to download a page's resources in parallel. In Firefox, type

    about:config

in the address bar and filter for "connection" to see what level of concurrency is currently set. Older browsers had lower limits (IE6, for example, defaulted to only 2 concurrent connections per host). I have never tried this myself, but I have read about using multiple subdomains to serve content; that should let the browser download more in parallel, because it treats each subdomain as a separate host.
See http://papermashup.com/using-subdomains-to-speed-up-your-site/ or do a Google search for more on this topic. (What I said pretty much agrees with what both Pekka and Dan told you.)
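To see the built-in parallelism in action, here is a small sketch that kicks off a batch of image requests at once; the URLs are placeholders, and the browser itself decides how many requests run concurrently, up to its per-host limit, with no special server setup:

    // Start many image requests at once; the browser schedules them in parallel
    // up to its concurrent-connection limit. The URLs below are placeholders.
    var urls = ['/thumbs/1.jpg', '/thumbs/2.jpg', '/thumbs/3.jpg', '/thumbs/4.jpg'];
    var loaded = 0;
    for (var i = 0; i < urls.length; i++) {
        var img = new Image();
        img.onload = function () {
            loaded++;
            console.log(loaded + ' of ' + urls.length + ' thumbnails loaded');
        };
        img.src = urls[i];
    }
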