I have a rather old server with 4GB of RAM that serves pretty much the same files all day, but it does so from the hard drive while 3GB of RAM sit "free".
Anyone who has ever tried running a ram-drive can attest that it's awesome in terms of speed. This system's memory usage rarely goes above 1GB of the 4GB, so I want to know if there is a way to use that extra memory for something good.
- Is it possible to tell the filesystem to always serve certain files out of RAM?
- Are there any other ways I can use RAM to improve file-reading performance?
More specifically, I am not looking for a 'hack' here. I want file system calls to serve the files from RAM without needing to create a ram-drive and copy the files there manually. Or at least a script that does this for me.
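To be concrete about what I mean by "a script that does this for me", here is a rough sketch (not a finished solution): mount a tmpfs ram-drive and mirror the hot files into it, then point the server at the ram-backed copy. The paths and the 1g size are made up, it needs root, and the contents vanish on reboot.

```python
# Sketch: create a tmpfs ram-drive and mirror a directory into it.
# SOURCE_DIR and RAMDISK_DIR are placeholder paths; size=1g is arbitrary.
import os
import shutil
import subprocess

SOURCE_DIR = "/var/www/static"   # hypothetical files to serve from RAM
RAMDISK_DIR = "/mnt/ramcache"    # hypothetical mount point

os.makedirs(RAMDISK_DIR, exist_ok=True)

# Mount a RAM-backed filesystem (requires root).
subprocess.run(
    ["mount", "-t", "tmpfs", "-o", "size=1g", "tmpfs", RAMDISK_DIR],
    check=True,
)

# Copy the hot files into the tmpfs; the web server (or a bind mount /
# symlink) would then be pointed at RAMDISK_DIR instead of SOURCE_DIR.
shutil.copytree(SOURCE_DIR, RAMDISK_DIR, dirs_exist_ok=True)
```

But as said above, I'd rather the filesystem did this transparently than have to maintain a second copy of the files like this.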
Possible applications here are:
- Web servers with static files that get read a lot
- Application servers with large libraries
- Desktop computers with too much RAM
Any ideas?
Edit:
- Found this very informative: The Linux Page Cache and pdflush
- As Zan pointed out, the memory isn't actually free. What I mean is that it isn't being used by applications, and I want to control what should be cached in memory.
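Having read about the page cache, here is the kind of "control what gets cached" I have in mind, as a minimal sketch: walk the hot files and ask the kernel to pull them into the page cache with `posix_fadvise(POSIX_FADV_WILLNEED)`. This is only a hint (pages can still be evicted under memory pressure), and the directory path is made up.

```python
# Sketch: hint the kernel to load specific files into the page cache.
# Linux + Python 3.3+ assumed; HOT_DIR is a placeholder path.
import os

HOT_DIR = "/var/www/static"  # hypothetical set of frequently read files

for root, _dirs, files in os.walk(HOT_DIR):
    for name in files:
        path = os.path.join(root, name)
        fd = os.open(path, os.O_RDONLY)
        try:
            size = os.fstat(fd).st_size
            # Tell the kernel we will need this whole file soon, so it
            # reads it into the page cache ahead of time.
            os.posix_fadvise(fd, 0, size, os.POSIX_FADV_WILLNEED)
        finally:
            os.close(fd)
```

Something like this run from cron after boot might be close enough, but I'm still interested in whether the filesystem can be told to keep those files resident on its own.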