My scripts use wget to retrieve data from the Internet. When many users run the script at once, I see a very high load (around 20.00) because of disk I/O. wget is started automatically each hour by cron. I would like to limit each customer to one running wget at a time. How can I do this?
I'm using CentOS 5.7 64-bit.
You could add a file lock to your script, for example with the flock command (in the util-linux package on Debian).
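A minimal sketch of that approach; the customer id, URL, and paths below are placeholders, not from the original post:

```bash
#!/bin/bash
# Per-customer lock around wget. The customer id ($1), URL, and paths
# are illustrative placeholders.
customer="$1"
lockfile="/var/lock/wget-${customer}.lock"

(
  # Take an exclusive lock on fd 9; a second invocation for the same
  # customer blocks here until the first one finishes. Use "flock -n 9"
  # to fail immediately instead, or "flock -w 60 9" to give up after 60 s.
  flock -x 9
  wget -q -O "/var/cache/myscript/${customer}.data" \
       "http://example.com/data?customer=${customer}"
) 9>"$lockfile"
```

With a blocking flock, runs for the same customer queue up behind one another instead of hitting the disk simultaneously.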
Check out the options available to you in /etc/security/limits.conf. You can set per-user process and memory limits there.
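As a rough illustration, entries in that file follow a domain/type/item/value format; the user name and values below are made up:

```
# /etc/security/limits.conf -- user name and values are examples only
# <domain>   <type>  <item>  <value>
customer1    hard    nproc   10       # at most 10 processes
customer1    hard    as      524288   # address-space limit, in KB
```

Note that these limits are applied through PAM at session setup, so they only take effect if the jobs run under a PAM stack that includes pam_limits.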
If it is the same request and content over and over, just cache the result for a few seconds and serve the same content to every user.
You could use memcache to implement this easily enough.
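Here is a rough file-based sketch of the same idea (memcached would work similarly); the URL, cache path, and 30-second lifetime are illustrative:

```bash
#!/bin/bash
# Serve a cached copy of the download; refresh it only when it is stale.
cache="/var/cache/myscript/data"
max_age=30  # seconds to keep serving the cached copy

# Re-download only when the cache is missing or older than max_age;
# every request in between reads the cached file instead of the network.
now=$(date +%s)
if [ ! -f "$cache" ] || [ $(( now - $(stat -c %Y "$cache") )) -gt "$max_age" ]; then
  wget -q -O "${cache}.tmp" "http://example.com/data" && mv "${cache}.tmp" "$cache"
fi
cat "$cache"
```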
If the content differs per user, be aware that the queueing you are asking for will cause long delays.