What is the best way to execute 5 curl requests in parallel from a bash script? I can't run them in serial for performance reasons.
Use '&' after a command to run it in the background, and 'wait' to wait for the backgrounded processes to finish. Use '()' around the commands if you need to run them in a sub-shell.
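That pattern can be sketched as follows; the file:// URLs and file names are placeholders so the sketch runs without network access — in practice you would list your real http(s) URLs:

```shell
#!/bin/bash
# Background five curl requests with '&', then 'wait' for all of them.
# file:// URLs are stand-ins so this sketch runs without a network.
tmp=$(mktemp -d)
for i in 1 2 3 4 5; do
  echo "payload $i" > "$tmp/src$i"    # fake "remote" content
done
for i in 1 2 3 4 5; do
  curl -s -o "$tmp/out$i" "file://$tmp/src$i" &   # '&' backgrounds each request
done
wait   # block until every backgrounded curl has exited
```

Note that a bare 'wait' waits for all background jobs; pass it a job spec or PID if you only need to wait for one.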
xargs has a "-P" parameter to run processes in parallel.
Reference: http://www.commandlinefu.com/commands/view/3269/parallel-file-downloading-with-wget
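That "-P" approach can be sketched like this; the urls.txt name and the file:// URLs are assumptions chosen so the example is self-contained:

```shell
#!/bin/bash
# -P 4: run up to four curl processes at once; -n 1: one URL per invocation.
# file:// URLs keep the sketch self-contained; use real URLs in practice.
tmp=$(mktemp -d)
mkdir "$tmp/src" "$tmp/out"
printf 'hello\n' > "$tmp/src/a"
printf 'world\n' > "$tmp/src/b"
printf 'file://%s/src/a\nfile://%s/src/b\n' "$tmp" "$tmp" > "$tmp/urls.txt"
# curl's -O saves each download under its remote file name
(cd "$tmp/out" && xargs -P 4 -n 1 curl -s -O < "$tmp/urls.txt")
```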
I use gnu parallel for tasks like this.
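A minimal sketch of that GNU parallel approach; the urls.txt layout and the -j 5 job cap are assumptions for illustration, and the guard skips the run where GNU parallel is not installed:

```shell
#!/bin/bash
# GNU parallel substitutes {} with each input line; -j 5 caps concurrency at 5.
# file:// URLs keep the sketch self-contained; use real URLs in practice.
tmp=$(mktemp -d)
mkdir "$tmp/dl"
printf 'one\n' > "$tmp/u1"
printf 'two\n' > "$tmp/u2"
printf 'file://%s/u1\nfile://%s/u2\n' "$tmp" "$tmp" > "$tmp/urls.txt"
# Guard: only run where GNU parallel (not the moreutils tool) is available.
if parallel --version 2>/dev/null | grep -q GNU; then
  (cd "$tmp/dl" && parallel -j 5 curl -s -O {} < "$tmp/urls.txt")
fi
```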
Here's a curl example with xargs:

    xargs -P 10 -n 1 curl -O < URLS.txt

The above example should curl each of the URLs in parallel, 10 at a time. The -n 1 is there so that xargs only uses 1 line from the URLS.txt file per curl execution; curl's -O flag saves each download under its remote file name.

What each of the xargs parameters does: