Sequential: for i in {1..1000}; do do_something $i; done
- too slow
Parallel: for i in {1..1000}; do do_something $i & done
- too much load
How can I run commands in parallel, but with no more than, say, 20 instances at a time?
Currently I usually resort to a hack like for i in {1..1000}; do do_something $i & sleep 5; done, but that is not a good solution.
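One standard way to get exactly this throttling is xargs with -P, which caps the number of concurrent processes. A minimal sketch (the inline sh -c stands in for do_something, which is not defined in the question):

```shell
# Run up to 20 jobs at once: -P 20 caps concurrency, -n 1 passes one
# argument per invocation. Each argument arrives in the child as $0.
seq 1 1000 | xargs -n 1 -P 20 sh -c 'echo "processing $0"'
```

xargs starts a replacement process as soon as one finishes, so the pool stays full without any fixed sleep.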
Update 2: Converted the accepted answer into a script: http://vi-server.org/vi/parallel
#!/bin/bash
NUM=$1; shift
if [ -z "$NUM" ]; then
echo "Usage: parallel <number_of_tasks> command"
echo " Sets environment variable i from 1 to number_of_tasks"
echo " Defaults to 20 processes at a time, use like \"MAKEOPTS='-j5' parallel ...\" to override."
echo "Example: parallel 100 'echo \$i; sleep \`echo \$RANDOM/6553 | bc -l\`'"
exit 1
fi
export CMD="$@";
true ${MAKEOPTS:="-j20"}
cat << EOF | make -f - -s $MAKEOPTS
.PHONY: all \${jobs}
jobs=\$(shell echo {1..$NUM})
all: \${jobs}
\${jobs}:
	i=\$@ sh -c "\$\$CMD"
EOF
Note that the line starting with "i=" is a make recipe, so it must be indented with a literal tab character (not spaces), or make will reject it.
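The same throttling can also be done with bash job control alone, with no make or external tools. This is a sketch of the simple batch variant (do_something is a placeholder standing in for the real command):

```shell
#!/bin/bash
# Placeholder for the real workload (assumption: not defined in the question).
do_something() { sleep 0.01; echo "task $1"; }

MAX=20
count=0
for i in {1..100}; do
  do_something "$i" &
  count=$((count + 1))
  # Once MAX jobs are running, wait for the whole batch before starting more.
  if [ "$count" -ge "$MAX" ]; then
    wait
    count=0
  fi
done
wait   # collect any remaining background jobs
```

The drawback is that each batch must drain completely before new jobs start; bash 4.3+ offers wait -n to refill the pool as soon as any single job exits.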
Comments:

One more option: xargs --max-procs=20 – Vi.

Hadn't known about "moreutils" and that there is already a tool for the job. Looking and comparing. – Vi.

The parallel in moreutils is not GNU Parallel and is quite limited in its options. The command above will not run with the parallel from moreutils. – Ole Tange
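Since the comments distinguish GNU Parallel from the moreutils tool of the same name, here is what the original task might look like with GNU Parallel (assumption: GNU Parallel is installed; the guard skips the demo when it is not):

```shell
# GNU Parallel: -j20 caps concurrent jobs, {} is replaced by each input line.
# Guarded so the snippet is a no-op on systems without GNU Parallel.
if command -v parallel >/dev/null 2>&1; then
  seq 1 1000 | parallel -j20 'echo processing {}'
fi
```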