
I have a few terabytes of data that are processed by a tool. The output of the tool should be sent to two other machines simultaneously (simultaneously because the tool causes downtime while it runs, and I want to limit that downtime).

If I just pipe from one machine then it's easy:

tool terabyte.txt | ssh user@1.2.3.4 /sbin/process-input

but how can I send the data to multiple machines simultaneously?

I do not mind if the solution involves other software or scripts. It doesn't have to be a "pure ssh" solution.

— cruppstahl

3 Answers


You could try using tee:

tool terabyte.txt | tee >(ssh user@1.2.3.5 /sbin/process-input) | ssh user@1.2.3.4 /sbin/process-input
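Note that >( ... ) is bash process substitution, so this needs bash rather than plain sh. The same approach extends to more than two machines by adding more substitutions; a sketch, assuming a hypothetical third host 1.2.3.6:

```shell
#!/usr/bin/env bash
# tee duplicates its stdin to each process substitution and also to stdout,
# so the last ssh in the pipeline consumes the stdout copy.
tool terabyte.txt \
  | tee >(ssh user@1.2.3.5 /sbin/process-input) \
        >(ssh user@1.2.3.6 /sbin/process-input) \
  | ssh user@1.2.3.4 /sbin/process-input
```

One caveat with this design: tee writes to all consumers at the speed of the slowest one, so one slow network link throttles every machine.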
— user9517

You can try ClusterSSH:

With ClusterSSH it is possible to make a SSH connection to multiple servers or machines to perform tasks from one single command window, without any scripting. The ‘cssh’ command lets you connect to any server specified as a command line argument, or to groups of servers (or cluster nodes) defined in a configuration file.

— aleroot

Perhaps pee from moreutils is even better:

pee is like tee but for pipes. Each command is run and fed a copy of the standard input. The output of all commands is sent to stdout.

So you can run:

tool terabyte.txt | pee 'ssh user@1.2.3.4 /sbin/process-input' 'ssh user@1.2.3.5 /sbin/process-input'

— minaev