This is a common situation: you want to include command 'foo' as part of a pipeline, but unfortunately command 'foo' only accepts actual filenames for I/O and does not read/write from stdin/stdout. I know that there's a Unix command which acts as a wrapper for misbehaved commands such as 'foo', but I can't remember its name. What is it?
- Use a named pipe. – Zoredache Apr 18 '12 at 17:08
2 Answers
Assuming foo uses -i for its input file and -o for its output file, this should convert it into a program suitable for a pipeline:
previousCommand | foo -i <(cat) -o >(cat) | nextCommand
This is called process substitution and, although not standard, it is available in at least both ksh and bash.
In simple cases like the previous example, the pipeline can be reduced like this:
foo -i <(previousCommand) -o >(nextCommand)
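Since 'foo' is hypothetical, here is a runnable sketch of the same trick with a shell function standing in for it (tr does the "real work"; the -i/-o interface is an assumption for illustration only):

```shell
#!/usr/bin/env bash
# Stand-in for the hypothetical 'foo': it accepts only -i FILE and
# -o FILE, i.e. it insists on filenames rather than stdin/stdout.
foo() {
    local OPTIND opt in out
    while getopts "i:o:" opt; do
        case $opt in
            i) in=$OPTARG ;;
            o) out=$OPTARG ;;
        esac
    done
    tr 'a-z' 'A-Z' < "$in" > "$out"    # the command's "real work"
}

# Process substitution wires it into a pipeline anyway:
# <(cat) relays the pipeline's stdin to foo as a filename,
# >(cat) relays foo's output file back onto stdout.
printf 'hello\n' | foo -i <(cat) -o >(cat) | sed 's/$/!/'
# prints: HELLO!
```

Bash implements <(...) and >(...) as /dev/fd entries (or named pipes), which is why foo can open them as ordinary filenames.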
jlliagre
- By the way, this is a non-standard shell feature. It works in Bash for example, but not in dash nor busybox. – Lekensteyn Apr 18 '12 at 16:19
- Thanks. But I could've sworn there is already a simple command out there that does the same in a portable way. Any ideas? – Jon Smark Apr 18 '12 at 16:42
- @Lekensteyn thanks for pointing that out. Answer updated. Jon, there is no portable (POSIX) way outside using mkfifo, but the latter breaks the pipeline structure. – jlliagre Apr 18 '12 at 21:45
I think you want mkfifo, which creates named pipes: special files that programs can read and write as if they were ordinary files.
Example:
mkfifo mypipe
perl mypipe &              # blocks, awaiting a script from the pipe
echo "print 55;" > mypipe  # the background perl now runs it and prints 55
ls -l mypipe               # the leading 'p' marks the inode as a pipe:
prw-r--r-- 1 user1 user1 0 Apr 18 12:10 mypipe
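For reference, here is how the named-pipe approach can drive a file-only command end to end using only POSIX tools, at the cost of explicit setup and cleanup. The 'foo -i IN -o OUT' command is hypothetical; tr plays its part in this sketch:

```shell
#!/bin/sh
# POSIX sketch: feed a file-only command through two named pipes.
dir=$(mktemp -d) || exit 1
trap 'rm -rf "$dir"' EXIT

mkfifo "$dir/in" "$dir/out"

# The file-only command runs in the background, opening the FIFOs
# by name as if they were regular input and output files.
tr 'a-z' 'A-Z' < "$dir/in" > "$dir/out" &   # i.e. foo -i in -o out

# Meanwhile the shell feeds one pipe and drains the other.
printf 'hello\n' > "$dir/in"
sed 's/$/!/' < "$dir/out"
# prints: HELLO!
```

Note the ordering: each open of a FIFO blocks until the other end is opened too, so the file-only command must run in the background before the shell touches the pipes, otherwise the script deadlocks.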
hurfdurf