This is a simple use of the pattern I wrote about in my last post for waiting on multiple commands in bash. In essence, I have a script which runs a command (like uptime, or restarting a daemon) on a whole bunch of servers (think pssh). Anyway… this is how I modified the script to run the command on multiple hosts in parallel. It is a bit simplistic in that it launches, say, 10 parallel ssh commands and then waits for all 10 to complete before starting the next batch. I'm very confident that someone could easily adapt this to run at a constant concurrency level of $threads (a rough sketch of that idea follows the script below), but I didn't need it just then, so I didn't go that far. As a side note, this is possibly the first time I've ever *needed* an array in a bash script… hah…
# $1 is the command to run on the remote hosts
# $2 is used for something not important for this script
# $3 is the (optional) number of concurrent connections to use
if [ ! "$3" == "" ]
then
threads=$3
else
threads=1
fi
cthreads=0
stack=()

# $servers holds the list of hosts to run against (set earlier in the full script).
for s in $servers
do
    # Once $threads jobs are in flight, wait for the whole batch to finish
    # before starting the next one.
    if [ "$cthreads" -eq "$threads" ]; then
        for job in "${stack[@]}"; do
            wait "$job"
        done
        stack=()
        cthreads=0
    fi

    # Run the command over ssh in a background subshell, prefixing
    # each piece of output with the host it came from.
    (
        for i in $(ssh "root@$s" "$1")
        do
            echo -e "$s:\t$i"
        done
    ) &
    stack[$cthreads]=$!
    let cthreads=$cthreads+1
done

# Wait for whatever is left in the final (possibly partial) batch.
for job in "${stack[@]}"; do
    wait "$job"
done
Thanks for bringing "wait" to my attention. I needed something to run a couple of processes at once, making sure that as soon as one process ends a new one fills its spot in the queue. The result, using wait and jobs, is here: http://sourcebench.com/languages/bash/simple-para…
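A minimal sketch of that keep-the-queue-full idea, using jobs to count the running background processes, could look something like the following. The command and work items here are purely hypothetical placeholders, and the linked post likely differs in the details.

# Illustrative only: never let the number of running jobs exceed $max.
max=4
for task in item1 item2 item3 item4 item5 item6   # hypothetical work items
do
    # Poll jobs(1) until a slot frees up.
    while [ "$(jobs -r -p | wc -l)" -ge "$max" ]; do
        sleep 1
    done
    long_running_command "$task" &   # hypothetical command
done
# Wait for the stragglers.
wait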