stdbuf not working while unbuffer does


I am trying to apply timestamps to the stdout of a process. For the timestamps to be accurate, I attempt to unbuffer the process's stdout. This works with unbuffer, but not with stdbuf, even though I would expect both to behave the same way. Consider the following slow-printing script 'slowprint':

#!/bin/bash

if [ $# -ne 2 ]; then
   echo "usage: ${0%%/*} <file> <delay in microseconds>"
   exit 1
fi

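# Print <file> line by line, pausing <delay> microseconds before each line.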
DELAY=$2 perl -pe 'BEGIN{use Time::HiRes qw(usleep)} { usleep($ENV{DELAY}) }' "$1"

Now compare the following two attempts to apply timestamps:

stdbuf -oL ./slowprint <(ls) 100000 | 
awk '{ print strftime("%H:%M:%S"), $0; fflush(); }'

vs

unbuffer ./slowprint <(ls) 100000 | 
awk '{ print strftime("%H:%M:%S"), $0; fflush(); }' 

The second one works for me while the first one doesn't, though I would expect them to do the same thing. Currently unbuffer is unsuitable for me because it swallows error codes in certain circumstances (I posted a separate question about that behavior).

frankc

Posted 2014-04-15T19:03:47.807

Reputation: 261

Necroed: Perl lets scripts do pretty low-level I/O, and I'd guess (but am not certain) that this affects buffering. You can override it here by setting $|=1, or, with 'English' in effect, $OUTPUT_AUTOFLUSH=1. This won't work for non-Perl programs, of course, but you may not hit the problem with non-Perl programs. – dave_thompson_085 – 2016-09-09T05:53:10.557
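If Perl's internal buffering is indeed the cause, one quick way to test that suggestion (a sketch, not verified here) is to enable autoflush inside the one-liner, so the last line of slowprint becomes:

DELAY=$2 perl -pe 'BEGIN{use Time::HiRes qw(usleep); $|=1} { usleep($ENV{DELAY}) }' "$1"

With $|=1 in effect, Perl flushes STDOUT after every print, so each line should reach the pipe immediately, regardless of whether stdbuf is used.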

Answers


Try annotate-output. It prefixes every line of STDOUT and STDERR with a timestamp, and adds informational lines marking the command's start and finish.

An example: use wc to count the lines of a bash process substitution (one line) and of a nonexistent file:

annotate-output wc -l <(echo foo) nosuchfile

Output:

10:17:45 I: Started wc -l /dev/fd/63 nosuchfile
10:17:45 O:       1 /dev/fd/63
10:17:45 E: wc: nosuchfile: No such file or directory
10:17:45 O:       1 total
10:17:45 I: Finished with exitcode 1
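
Applied to the script from the question, the invocation might look something like this (an untested sketch, assuming slowprint is executable in the current directory):

annotate-output ./slowprint <(ls) 100000

Whether each O: line then carries a distinct timestamp still depends on how slowprint buffers its output, so it is worth checking against the original script.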

agc

Posted 2014-04-15T19:03:47.807

Reputation: 587