75

Using the pipe (|) feature in Linux I can chain the standard output of one process into the standard input of another.

I can use tee to split the output to separate sub-processes.

Is there a command to join two input streams?

How would I go about this? How does diff work?

Cristian Ciupitu
  • 6,226
  • 2
  • 41
  • 55

7 Answers

120

Personally, my favorite is the following (it requires bash and other things that are standard on most Linux distributions).

The details can depend a lot on what the two things output and how you want to merge them ...

Contents of command1 and command2 after each other in the output:

cat <(command1) <(command2) > outputfile

Or if both commands output alternate versions of the same data that you want to see side-by-side (I've used this with snmpwalk; numbers on one side and MIB names on the other):

paste <(command1) <(command2) > outputfile

Or if you want to compare the output of two similar commands (say a find on two different directories):

diff <(command1) <(command2) > outputfile

Or if they're ordered outputs of some sort, merge them:

sort -m <(command1) <(command2) > outputfile

Or run both commands at once (could scramble things a bit, though):

cat <(command1 & command2) > outputfile

The <() operator sets up a named pipe (or a /dev/fd file descriptor reference) for each command, pipes the output of that command into it, and passes the name on the command line. There's an equivalent with >(). You could do: command0 | tee >(command1) >(command2) >(command3) | command4 to simultaneously send the output of one command to 4 other commands, for instance.
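As a quick illustration of what the shell is doing (a minimal sketch, assuming bash on Linux; the exact /dev/fd number may differ), the substituted command never sees a pipe symbol at all, only a file name it can open and read:

$ echo <(true)
/dev/fd/63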

freiheit
  • 14,334
  • 1
  • 46
  • 69
  • Awesome! I've read bash's man page lots of times but hadn't picked up on that one – Javier Aug 16 '10 at 21:12
  • 2
    You can find the reference in the [Advanced Bash-Scripting Guide](http://tldp.org/LDP/abs/html/process-sub.html) at the Linux Documentation Project – brice Jul 08 '11 at 15:50
  • 3
    I was able to prevent interleaved lines by piping through `grep --line-buffered` – handy for concurrently `grep`'ing the `tail` of multiple log files. See http://stackoverflow.com/questions/10443704/line-buffered-cat – RubyTuesdayDONO Apr 08 '13 at 20:47
18

You can append two streams to one another with cat, as gorilla shows.

You can also create a FIFO, direct the output of the commands to that, then read from the FIFO with whatever other program:

mkfifo ~/my_fifo          # create the named pipe
command1 > ~/my_fifo &    # both writers feed the same FIFO
command2 > ~/my_fifo &
command3 < ~/my_fifo      # the reader sees the combined stream

This is particularly useful for programs that will only write to or read from a file, or for mixing a program that only writes to stdout with one that only reads from a file (or vice versa).
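For instance (a hypothetical sketch; report_tool stands in for any program that insists on being given a file path):

mkfifo /tmp/merged
zcat log1.gz > /tmp/merged &     # two producers writing into the same FIFO
zcat log2.gz > /tmp/merged &
report_tool /tmp/merged          # report_tool is a placeholder for a file-only reader
rm /tmp/merged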

Chris S
  • 77,337
  • 11
  • 120
  • 212
  • 2
    This one works on pfSense (FreeBSD) whereas the accepted answer does not. Thank you! – Nathan Stocks Jul 07 '16 at 15:05
  • How does this work in terms of preventing either file from overwriting the other's data? I'm looking for something that respects line buffering – smac89 Apr 18 '21 at 01:25
13

(tail -f /tmp/p1 & tail -f /tmp/p2) | cat > /tmp/output

/tmp/p1 and /tmp/p2 are your input pipes, while /tmp/output is the output.
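For completeness, a sketch of how those input pipes might be created and fed first (command1 and command2 are just illustrative placeholders):

mkfifo /tmp/p1 /tmp/p2           # create the two input pipes
command1 > /tmp/p1 &             # placeholder producers feeding each pipe
command2 > /tmp/p2 &
(tail -f /tmp/p1 & tail -f /tmp/p2) | cat > /tmp/output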

Cristian Ciupitu
  • 6,226
  • 2
  • 41
  • 55
gorilla
  • 1,207
  • 9
  • 6
6

I have created a special program for this: fdlinecombine

It reads multiple pipes (usually program outputs) and writes them to stdout line by line (you can also override the separator).

Vi.
  • 821
  • 11
  • 19
3

Be careful here; just catting them will end up mixing the results in ways you may not want: for instance, if they're log files you probably don't really want a line from one inserted halfway through a line from the other. If that's okay, then

tail -f /tmp/p1 /tmp/p2 > /tmp/output

will work. If that's not okay, then you're going to have to find something that will do line buffering and only output complete lines. Syslog does this, but I'm not sure what else might.

EDIT: an optimization for unbuffered reading with named pipes:

Consider /tmp/p1, /tmp/p2, and /tmp/p3 to be named pipes, created with "mkfifo /tmp/pN":

tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' &

Now we can read the output named pipe "/tmp/p3" unbuffered with:

tail -f /tmp/p3

There is a small bug of sorts: you need to "initialize" the first input pipe /tmp/p1 with:

echo -n > /tmp/p1

so that tail will accept input from the second pipe /tmp/p2 first and not wait until something arrives on /tmp/p1. This may not be necessary if you are sure /tmp/p1 will receive input first.

The -q option is also needed so that tail does not print headers with the file names.
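Putting those pieces together in order (a sketch; command1 and command2 are placeholder producers):

mkfifo /tmp/p1 /tmp/p2 /tmp/p3
tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' &
echo -n > /tmp/p1    # "initialize" the first pipe so tail moves on to /tmp/p2
tail -f /tmp/p3 &    # read the merged output line by line
command1 > /tmp/p1 & # placeholder producers
command2 > /tmp/p2 &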

readyblue
  • 119
  • 1
  • 7
pjz
  • 10,497
  • 1
  • 31
  • 40
  • More useful would be: **`tail -q -f /tmp/p1 /tmp/p2 | another_command`** as it works line by line, and with the -q option it will not print any other garbage – readyblue Oct 22 '14 at 19:27
  • For an unbuffered file/named pipe use: **`tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' &`**. Now /tmp/p3 can even be a named pipe and you can read it with simply **`tail -f /tmp/p3`**. All of this is **UNBUFFERED = line by line**. There is, however, a small bug of sorts: the first file/named pipe needs to be initialized first so that tail will accept output from the second, so you will need to `echo -n > /tmp/p1` and then everything will work smoothly. – readyblue Oct 22 '14 at 20:47
3

A really cool command I have used for this is tpipe; you might need to compile it because it's not that common. It's really great for doing exactly what you're talking about, and it's so clean I usually install it. The man page is located here: http://linux.die.net/man/1/tpipe . The currently listed download is at this archive: http://www.eurogaran.com/downloads/tpipe/ .

It's used like this:

## Reinject sub-pipeline stdout into standard output:
$ pipeline1 | tpipe "pipeline2" | pipeline3
J. M. Becker
  • 2,431
  • 1
  • 16
  • 21
1

The best program for doing this is lmerge. Unlike freiheit's answer, it's line-oriented, so the output of the two commands won't clobber each other. Unlike other solutions, it fairly merges the input so that no command can dominate the output. For example:

$ lmerge <(yes foo) <(yes bar) | head -n 4

Gives output of:

foo
bar
foo
bar
reinierpost
  • 410
  • 3
  • 9
Rian Hunter
  • 61
  • 1
  • 1