I have a few questions about caching a website, since I haven't tried caching a site before.
First: how do I cache a site so it loads faster when clients browse it? For example, I have many images in my CSS styles; how do I cache those?
From what I've read, caching in PHP is done through the <head>
tag, and caching can also be done in .htaccess
(glad I'm using .htaccess ^_^).
I added these tags in my header:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1" />
<meta http-equiv="expires" content="-1">
<meta http-equiv="pragma" content="no-cache">//or content="cache"???
Also, this is what's inside my .htaccess:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} ^system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
RewriteCond %{REQUEST_URI} ^application.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?/$1 [L]
</IfModule>
<IfModule !mod_rewrite.c>
ErrorDocument 404 /index.php
</IfModule>
Caching is rarely a 'drop-in' solution, but implemented correctly it can yield a significant speed increase for a website.
The first item you mention, 'a lot of CSS images', is not a server issue per se. To fetch those, the browser must make a request for each image, each with its own overhead. Combine the images into a CSS sprite to minimize the number of requests and maximize the compression that your chosen image format applies. A similar idea applies to CSS and JS files: combine and minify to the extent possible. For website-specific recommendations, try Google's PageSpeed or Yahoo's YSlow.
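To illustrate the sprite idea (the class names and file path here are made up): all the icons live in one combined image, and each CSS class just shifts the background so the right 16x16 slice shows. One image means one request instead of many.

```css
/* One combined image holds every icon; each class selects a slice. */
.icon        { background-image: url('/img/sprite.png'); width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
.icon-user   { background-position: -32px 0; }
```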
On the server side of things, there are a number of caches for PHP that store the intermediate form of the translated script - opcode caches. The most commonly used one is APC, but XCache and eAccelerator are often used as well (use only one of them).
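For instance, enabling APC is usually just a php.ini change once the extension is installed (the values below are examples, not recommendations):

```ini
; php.ini - enable the APC opcode cache (example settings)
extension=apc.so    ; on Windows: php_apc.dll
apc.enabled=1
apc.shm_size=64M    ; shared memory for cached opcodes
                    ; (older APC versions expect a plain number of MB, e.g. 64)
```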
From the PHP side of things, caching typically involves generating a static copy of your content. This can be achieved using output buffering (the ob_* functions, e.g. ob_start()). You start buffering before sending any content, end it after your page has been fully constructed, and save the resulting content to a file. You then check for the existence of such a file before running the PHP script (i.e. if the cache exists, serve that; otherwise run the script).
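A minimal sketch of that file-based approach (the cache directory and the 10-minute lifetime are arbitrary choices, not requirements):

```php
<?php
// Simple full-page cache using output buffering.
// The cache file name is derived from the requested URI.
$cacheDir  = __DIR__ . '/cache';
$cacheFile = $cacheDir . '/' . md5($_SERVER['REQUEST_URI']) . '.html';
$cacheLife = 600; // seconds - serve the cached copy for 10 minutes

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $cacheLife) {
    // Cache hit: serve the static copy and skip the expensive work below.
    readfile($cacheFile);
    exit;
}

ob_start(); // start buffering before sending any content

// ... build the page as usual ...
echo "<html><body>Generated at " . date('H:i:s') . "</body></html>";

$html = ob_get_contents(); // grab the fully constructed page
if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true);
}
file_put_contents($cacheFile, $html); // save it for the next request
ob_end_flush(); // and still send the page to this visitor
```

Note that a scheme like this caches every URI it sees, so it only suits pages whose content is the same for all visitors.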
Additionally, you can add another layer to your web stack - Varnish is a popular choice. It is a reverse proxy and caching server: requests to your site are received by Varnish first, and if it already has a copy of the page, it serves it directly, bypassing the backend. If it doesn't have a copy, it passes the request to the backend. Varnish can be configured to store its cache directly in memory or to use disk-backed storage, and it is quite efficient at what it does.
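As a taste of what Varnish configuration looks like, the fragment below (VCL syntax; host and port are examples for a setup where Varnish listens on :80 and Apache on :8080) tells Varnish where the backend lives:

```vcl
# default.vcl - point Varnish at the backend web server
backend default {
    .host = "127.0.0.1";   # Apache/PHP listening here
    .port = "8080";        # while Varnish itself answers on port 80
}
```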
As with any approach to caching, you face the potential problem of leaking sensitive information: if you cache a page that should only be visible to logged-in users and end up serving it to an unauthenticated user, you have a problem. As such, a server like Varnish tends not to cache any request containing cookies - something that is important to consider if you want to actually achieve a high hit rate.
The other possible consideration is a content distribution network - a service with edge locations geographically closer to your users, so requests can be served faster. Usually, though, this is more of a consideration for large files; if you are serving many little files (e.g. images), you can probably improve your site design instead.
As stated by David, your question is very broad.
However, after having tried to read your mind, it seems to me that what you want is to use HTTP's caching features. I believe you want your users' browsers to cache your site's images and .css files, so that they avoid fetching all this content every single time they access your site, thus making it load faster.
Well, HTTP has features that allow you to do just that.
Beginning with the basics, HTTP is a request-response protocol. The browser sends a request for a resource and the server responds. Both requests and responses contain headers, which describe how the browser and the server must behave when they receive responses and requests, respectively.
You can tell your users' browsers to cache images and .css files by setting response headers in your .htaccess file.
The headers you'll need in your responses are the standard HTTP caching headers: Expires and Cache-Control (which say how long a copy may be kept), plus Last-Modified and ETag (which let the browser revalidate a stale copy cheaply).
After you understand this and pick the headers that you believe best fit your needs, you can set them using Apache's mod_headers and the FilesMatch directive.
Here's an example of what you can put in your .htaccess (it might be exactly what you want):
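Lacking details of your setup, a sketch of such a fragment follows - the file extensions and the 30-day lifetime are examples, and mod_headers must be enabled on your server:

```apache
<IfModule mod_headers.c>
  # Let browsers keep static assets for 30 days (2592000 seconds)
  <FilesMatch "\.(ico|jpe?g|png|gif|css|js)$">
    Header set Cache-Control "max-age=2592000, public"
  </FilesMatch>
</IfModule>
```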
Before you do that, though, I suggest you download and install Google's Page Speed extension for Chrome (http://code.google.com/speed/page-speed/download.html). Then run the page speed test and look at its results, especially the section "Leverage browser caching", to see whether you really need to take action.
Controlling the caching of static HTML files with meta tags is not really reliable, at least in my opinion. You should have a look at mod_expires (using HTTP headers for cache control): http://httpd.apache.org/docs/2.0/mod/mod_expires.html
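A minimal mod_expires setup in .htaccess could look like this (the lifetimes are examples, and the module must be enabled):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```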
If you want to cache dynamic sites, you can cache the whole page or just the database requests (depends on your database/site design). Maybe this link is helpful: http://blog.digitalstruct.com/2008/02/27/php-performance-series-caching-techniques/
Your question is very broad. "Caching" is done on several different layers, so asking "how do I cache my website" is a bit like asking "how do I cook dinner". There are many ways to do it.
You could cache on the database layer if you have a dynamic website with a database such as MySQL. You could cache on the PHP layer with memcache or APC. You could cache on the server layer with a reverse proxy such as Varnish.
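On the PHP layer, the pattern with APC's user cache (memcache works the same way) is "check, else compute and store". A sketch, assuming the APC extension is loaded and with expensive_query() standing in for your real work (it's a hypothetical function, not a library call):

```php
<?php
// Cache the result of an expensive operation for 5 minutes.
$key    = 'frontpage_articles';
$result = apc_fetch($key, $success);

if (!$success) {
    // Cache miss: do the expensive work, then store the result.
    $result = expensive_query();   // hypothetical, e.g. a heavy SQL query
    apc_store($key, $result, 300); // TTL in seconds
}
```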
And these don't even touch client-side caching (i.e. each client's web browser caching content for that specific person), which is what you're talking about with .htaccess and header information.
You will be better off focusing on server-side caching, instead of client-side caching.
Building your caching architecture depends heavily on what your application is built on, how often the content changes, and how up to date the information needs to be.
If you have a dynamic website, I would first recommend studying how you can increase MySQL's performance with its own built-in caching mechanism, the query cache.
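Turning the query cache on is a my.cnf change (the sizes below are examples; note it only helps with SELECTs against tables that aren't written to constantly, since any write invalidates the cached results for that table):

```ini
# my.cnf - in the [mysqld] section
query_cache_type  = 1      # cache SELECT results
query_cache_size  = 64M    # memory reserved for the cache
query_cache_limit = 1M     # don't cache any single result bigger than this
```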
If you have a very busy website, then I would recommend studying how Varnish or another reverse-proxy caching server works.
Either way, look over these topics, learn, and feel free to come back when you have more specific questions.