So ImageMagick has the "convert" command, which I use on my Linux web platform for image resizing and the like.
Sometimes this command gets "stuck", meaning it starts eating machine resources until the entire machine is unavailable. Logging in to the machine can take minutes when this happens.
So I'm looking for one of two solutions:
1. When invoking this command, is there a way to cap the maximum resources it may use?
2. Can I run a cron script that identifies these "stuck" processes and kills them? I.e., filter on CPU time or CPU usage and kill anything over a specific threshold.
I would probably prefer solution 2, since with solution 1 I could still end up with several such processes, each individually limited yet stuck, that together would still eat up my resources.
I can't identify when or why this happens; the system is sluggish and stuck until I run "killall convert", and then all is well. The command is run thousands of times each hour, so unless I logged each and every invocation somehow, I can't tell which one got stuck, unfortunately.
So, basically: a shell script that identifies, kills, and logs stuck convert processes.
Any ideas?
Just before invoking the convert command you can set a resource limit on the maximum CPU time the process may use; `convert` will then be killed automatically once it exceeds that limit. Whether this works depends on how you invoke the command initially; the `exec` may be unnecessary.

I have created a script that kills the processes listed in an array if their CPU usage is greater than XX% for YY seconds, or if they have been running for more than ZZ seconds.
You can set XX, YY, and ZZ at the top of the file.
The script can use either ps or top to check processes.
There is also a dry-run mode, which checks but does not kill.
Finally, the script sends an email if any processes were killed.
Here is my repo on GitHub: https://github.com/padosoft/kill-process
Here are some screenshots of sample output.
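A minimal watchdog along these lines could look like the sketch below. This is my own illustration, not code from the repo above: the thresholds, the log path, and the choice of `ps` fields are all assumptions you would tune for your system.

```shell
#!/bin/bash
# Sketch of a cron-driven watchdog for stuck "convert" processes.
# All thresholds and paths are placeholders, not values from the repo.
MAX_CPU=80        # kill if %CPU reported by ps is at or above this
MAX_ELAPSED=300   # ...or if the process has run longer than this many seconds
LOG="/var/log/convert-watchdog.log"

# List every convert process: PID, %CPU, elapsed seconds
# (the trailing "=" in each -o field suppresses the header line).
ps -C convert -o pid=,pcpu=,etimes= | while read -r pid cpu etimes; do
    cpu_int=${cpu%.*}    # drop the decimal part for integer comparison
    if [ "${cpu_int:-0}" -ge "$MAX_CPU" ] || [ "$etimes" -ge "$MAX_ELAPSED" ]; then
        echo "$(date '+%F %T') killing convert pid=$pid cpu=${cpu}% elapsed=${etimes}s" >> "$LOG"
        kill "$pid"                                    # polite SIGTERM first
        sleep 5
        kill -0 "$pid" 2>/dev/null && kill -9 "$pid"   # SIGKILL if still alive
    fi
done
```

Run it from cron every minute or so. One caveat: the %CPU that ps reports is CPU time divided by wall-clock lifetime, not an instantaneous reading, so a long-lived but mostly idle process scores low; the elapsed-time check covers that case. For the first answer's approach, the usual shape is a wrapper such as `ulimit -t 60; exec convert ...`, where bash's `ulimit -t` limits the CPU seconds the process may consume.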