We are running computing jobs with GridEngine. Every job reports 3 different times:
- Wall clock time
- User time
- CPU time
What are the differences between these three? Which of the three is most suitable for comparing the performance of two applications/scripts?
Wall clock time is the actual amount of time taken to perform a job. It is equivalent to timing your job with a stopwatch, and the measured time can be affected by anything else the system happens to be doing at the same time.
User time measures the amount of time the CPU spent running your code. It does not count anything else that might be running, nor does it count CPU time spent in the kernel on your behalf (such as for file I/O).
CPU time measures the total amount of time the CPU spent running your code or anything requested by your code. This includes kernel time.
The "User time" measurement is probably the most appropriate for measuring the performance of different jobs, since it will be least affected by other things happening on the system.
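The distinction is easy to see in practice. The sketch below (a hypothetical helper, using Python's `os.times()` and `time.perf_counter()`) times a call that mostly sleeps: wall clock time covers the full half second, while user and system CPU time stay close to zero because the process is idle while sleeping.

```python
import os
import time

def run_and_time(fn):
    """Run fn() and return its wall, user, and system times."""
    t0 = os.times()
    w0 = time.perf_counter()
    fn()
    w1 = time.perf_counter()
    t1 = os.times()
    return {
        "wall": w1 - w0,                  # stopwatch time
        "user": t1.user - t0.user,        # CPU time in user mode
        "system": t1.system - t0.system,  # CPU time in the kernel
    }

# Sleeping consumes almost no CPU: wall >> user + system.
timings = run_and_time(lambda: time.sleep(0.5))
print(timings)
```

A CPU-bound loop would show the opposite pattern, with user time close to wall time; a job doing heavy I/O would accumulate system time as well.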
From Wikipedia:
Wall clock time is the time you would get if you measured the runtime with a stopwatch. User time is the amount of time the CPU spends running exclusively the code in your job (this does not include system calls your job may make). CPU time is the amount of time the CPU spends actively running your code, including any system calls.