
What is the best way to execute 5 curl requests in parallel from a bash script? I can't run them in serial for performance reasons.

Justin
  • Have you tried searching for parts of your solution? Another SF question seems to be exactly what you're asking for: http://serverfault.com/questions/248143/multithreaded-downloading-with-shell-script – Theuni Dec 09 '12 at 09:47
  • http://stackoverflow.com/questions/8634109/parallel-download-using-curl-command-line-utility – Ciro Santilli OurBigBook.com May 26 '16 at 20:43

4 Answers


Append `&` to a command to run it in the background, and use `wait` to wait for all background jobs to finish. Wrap the commands in `( )` if you need a sub-shell.

#!/bin/bash

# Launch each download in the background; each "&& echo" reports when its job finishes.
curl -s -o foo http://example.com/file1 && echo "done1" &
curl -s -o bar http://example.com/file2 && echo "done2" &
curl -s -o baz http://example.com/file3 && echo "done3" &

# Block until all background jobs have completed.
wait
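
Since the question asks about exactly 5 requests, a loop version may be less repetitive. This is only a sketch: the URLs and the out1..out5 output names are made up.

#!/bin/bash

# Hypothetical list of the 5 URLs to fetch.
urls=(
    http://example.com/file1
    http://example.com/file2
    http://example.com/file3
    http://example.com/file4
    http://example.com/file5
)

# Start one background curl per URL, saving to out1..out5.
for i in "${!urls[@]}"; do
    curl -s -o "out$((i + 1))" "${urls[$i]}" &
done

# Wait for all of them to finish.
wait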
Anton Cohen
  • Simple, but effective for a first step. Gets hacky quickly when things need to change, like hostname or number of repetitions. Thanks. – Chris Oct 16 '18 at 13:07

xargs has a `-P` parameter to run processes in parallel. For example:

wget -nv http://en.wikipedia.org/wiki/Linux -O- | egrep -o "http://[^[:space:]]*\.jpg" | xargs -P 10 -r -n 1 wget -nv

Reference: http://www.commandlinefu.com/commands/view/3269/parallel-file-downloading-with-wget

Fan___

I use GNU parallel for tasks like this; a short sketch follows the comments below.

Dennis Kaarsemaker
  • Could you provide an example for calling `curl` with `gnu parallel`? – m13r Jan 18 '18 at 12:34
  • Yes, parallel seems very good and it's easy to send the same request 100 times. But an example of how to use parallel to send 100 different curl requests would make this answer better. – рüффп Mar 27 '19 at 09:36
  • 1
    For example: https://gist.github.com/CMCDragonkai/5914e02df62137e47f32 – mirrorw Apr 09 '19 at 09:35
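
Along the lines the commenters ask for, here is a minimal sketch of a parallel-plus-curl invocation. The file name URLS.txt is hypothetical (one URL per line), and -j 5 caps the number of concurrent jobs:

# Run at most 5 curl jobs at once; {} is replaced by each line of
# URLS.txt (hypothetical file, one URL per line). -O saves each
# download under its remote file name.
parallel -j 5 curl -s -O {} :::: URLS.txt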

Here's a curl example with xargs:

$ cat URLS.txt | xargs -P 10 -n 1 curl

The above example should curl each of the URLs in parallel, 10 at a time. The -n 1 is there so that xargs only uses 1 line from the URLS.txt file per curl execution.

What each of the xargs parameters do:

$ man xargs

-P maxprocs
             Parallel mode: run at most maxprocs invocations of utility at once.
-n number
             Set the maximum number of arguments taken from standard input for 
             each invocation of utility.  An invocation of utility will use less 
             than number standard input arguments if the number of bytes 
             accumulated (see the -s option) exceeds the specified size or there 
             are fewer than number arguments remaining for the last invocation of 
             utility.  The current default value for number is 5000.
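
If you want each download saved under its remote file name instead of streamed to stdout, a small variant of the same command works; this is a sketch using the same hypothetical URLS.txt:

# Same parallelism as above, but -s silences the progress meter and
# -O saves each URL under its remote file name.
xargs -P 10 -n 1 curl -s -O < URLS.txt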