
tee forwards its stdin to every file specified, while pee does the same for pipes: both programs send every line of their stdin to each file/pipe specified.

However, I was looking for a way to "load balance" stdin across different pipes: one line goes to the first pipe, the next line to the second, and so on. It would also be nice if the stdout of the pipes were collected into one stream as well.

The use case is simple parallelization of CPU-intensive processes that work on a line-by-line basis. I was running a sed over a 14 GB file, and it could have finished much faster if I could use multiple sed processes. The command was like this:

pv infile | sed 's/something//' > outfile

To parallelize, the best option would be for GNU parallel to support this functionality directly, like so (I made up the --demux-stdin option):

pv infile | parallel -u -j4 --demux-stdin "sed 's/something//'" > outfile

However, no such option exists, and parallel always uses its stdin as arguments for the command it invokes, like xargs. So I tried the following, but it's hopelessly slow, and it's clear why:

pv infile | parallel -u -j4 "echo {} | sed 's/something//'" > outfile

I just wanted to know if there's any other way to do this (short of coding it up myself). If there were a "load-balancing" tee (let's call it lee), I could do this:

pv infile | lee >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile)

Not pretty, so I'd definitely prefer something like the made-up parallel version, but this would work too.
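For what it's worth, a crude approximation of that hypothetical "lee" may be possible with GNU coreutils alone: split's r/N mode distributes stdin lines round-robin, and --filter pipes each stream through a command, with the filters' stdout merged into split's stdout. A sketch under that assumption (the toy input and sed expression are just placeholders for the real workload):

```shell
# Round-robin stdin across 3 concurrent sed workers via GNU split.
# Each worker receives every third line; the workers' stdout is merged
# (merge order across workers is not guaranteed, hence the sort here).
printf 'x%s\n' 1 2 3 4 5 6 | split -n r/3 --filter="sed 's/x//'" - | sort -n
```

Since the interleaving of output across workers is only defined per worker, this fits cases like the one above where the order of outfile doesn't matter.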

ehsanul
  • It sounds like what you're asking for may not be possible, if only because it has a high chance of not lining up the output. If this is a contiguous 14GB file, pipe-lining it while maintaining order wouldn't work. Is this the case? – Andrew M. Jan 14 '11 at 23:31
  • Well, I don't actually care about maintaining order in the file in this case. However, I'm sure it's important for other tasks. If the program which divides the stdin also recombines the stdout, as I hope for, it can do a line-by-line recombine, which would maintain order just fine I think. – ehsanul Jan 14 '11 at 23:40

2 Answers


We are discussing how to implement exactly this feature on the GNU Parallel mailing list right now: http://lists.gnu.org/archive/html/parallel/2011-01/msg00001.html

Feel free to join: http://lists.gnu.org/mailman/listinfo/parallel

A prototype is now ready for testing: http://lists.gnu.org/archive/html/parallel/2011-01/msg00015.html

Ole Tange

I'd look at implementing this in Perl with Parallel::ForkManager. You could do the line splitting in the script and then feed the resulting lines into Parallel::ForkManager processes, then use the run_on_finish callback to collect the output. Obviously for your sed example you could just do the text operation in Perl instead, and maybe use something like AnyEvent to handle the parallelism.

Phil Hollenback