2

I am tailing a file's output and grepping for lines containing certain data. I don't want to print the matching lines to the screen; instead I want to count the number of matches found and display that count. The count can scroll as it increments, or it can overwrite itself in place as it grows higher. That part is not really important; I just need a running count of the matches found.

My command right now is

tail -f logfile | grep 'data I want'

I have tried using grep -c and wc -l, but nothing has given me the results I am after. This particular Linux distro does not have pv, and I won't be able to get it. Is there a way I can do this?

user53029
  • You could compile pv from source - that's what I've done on systems that haven't had it. You don't even have to install it - you can just grab the binary after running `make`. – sa289 Jul 23 '15 at 18:34
  • It's a locked-down system. We don't have rights to install or compile. But thanks anyway. – user53029 Jul 23 '15 at 18:39

2 Answers

12

GNU awk can do this fairly easily.

Rolling output:

tail -f logfile | grep 'stuff to grep for' | awk '{++i;print i}'

You can also leave out the grep and use awk's regular expressions instead:

tail -f logfile | awk '/stuff to grep for/ {++i;print i}'

For single-line output you can prepend a CR to make it start at the front of the line again (works on a console):

tail -f logfile | awk '/stuff to grep for/ {++i;printf "\r%d",i}'
David
  • \r worked in PuTTY for me as well - that's a great trick to know about! – sa289 Jul 23 '15 at 19:44
  • Should have mentioned this is not a console session, it's a terminal (ssh). Would this make a difference? – user53029 Jul 23 '15 at 20:05
  • Ok can confirm the second option above works. Testing the 3rd command now. – user53029 Jul 23 '15 at 20:12
  • 3rd option also works like a charm. Thanks so much! – user53029 Jul 23 '15 at 21:57
  • Note for future readers: you may have to use `gawk` and/or add `system("")` command to flush the output when tailing (plain awk seems to buffer it for me in Debian 8). – yozh Nov 08 '18 at 10:23
  • Right; or `-W interactive` for some versions of `mawk`. Similarly, you may want to run your `grep` with `--line-buffered` when tailing a file / streaming short lines. – Jedi May 27 '20 at 01:52
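Pulling those buffering tips together, a pipeline along these lines should keep the count updating per line rather than per block (a sketch only; it assumes GNU grep's `--line-buffered` option and gawk's `fflush()` are available):

tail -f logfile | grep --line-buffered 'stuff to grep for' | gawk '{ ++i; printf "\r%d", i; fflush() }'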
1

There is always the trusty watch option:

 watch -d grep -c "string" /path/to/file 

which is far from ideal when your file size is in excess of a couple of hundred MB.

Thanks to @sa289 for the suggestion to use an intermediate file:

 tail -f /path/to/file | grep "string" > /tmp/intermediate-file &
 watch -d grep -c "string" /tmp/intermediate-file
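One caveat with the intermediate-file variant: quitting `watch` leaves the background `tail | grep` running and the temporary file behind. A small wrapper along these lines could tidy that up (a sketch only, assuming a POSIX shell with `mktemp`; the path and "string" are placeholders for your own logfile and pattern):

 #!/bin/sh
 # Sketch: the intermediate-file approach with cleanup on exit.
 tmp=$(mktemp)                                     # collects the matched lines
 tail -f /path/to/file | grep "string" > "$tmp" &
 bg=$!                                             # PID of the last stage of the background pipeline (the grep)
 trap 'kill "$bg" 2>/dev/null; rm -f "$tmp"' EXIT INT TERM   # tail exits on its next write once grep is gone
 watch -d grep -c "string" "$tmp"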
HBruijn
  • As a variant of what you posted which would work better with large files, the `tail -f logfile | grep 'data I want'` could be redirected to a file such as `tail -f logfile | grep 'data I want' > grepped_output`, and then the `watch` command could do a `wc -l` on that file. – sa289 Jul 23 '15 at 18:33
  • This method just gave me "file not found" output and the like scrolling down the screen. The same thing happened when I redirected to stdout and watched. The suggestions above are working, so I think I can work it from there. Thanks for the help! – user53029 Jul 23 '15 at 20:15