
Is there a command in Solaris that reads a file and, when it reaches the end, keeps streaming new data the way tail does? I need to read the file from the start, and it is a binary file.

Information on Solaris and Linux would be appreciated.

700 Software
    What are you trying to achieve? Piping a binary file through a text utility doesn't usually end well. – user9517 Jan 07 '12 at 15:27
  • When a file is being written, I want to process it using a binary reader such as a gzip decompressor, but I don't want it to fail when it reaches the current end of the file. Instead I want it to wait, and not exit until I explicitly stop it. I do not need the whole file, just up to about half way through. I want something like `cat -f | gzip -d | tee result` – 700 Software Jan 07 '12 at 15:40

2 Answers


In Linux you can use tail -f -n +0 /path/filename to see it. While -n normally specifies how many lines from the end of the file you want printed, when passed +<n> it instead starts output at the nth line from the beginning of the file.

From tail --help:

-n, --lines=K            output the last K lines, instead of the last 10;
                         or use -n +K to output lines starting with the Kth
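For the pipeline the asker describes in the comments, a minimal sketch (the filename is a placeholder) would be:

# follow the growing file from its first line, decompressing as data arrives
tail -f -n +0 /path/file.gz | gzip -d | tee result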
altendky
Gea-Suan Lin

tail -9999f will do something close to what you want. Add more 9s if your file is bigger.

Problems:

  1. Binary files may not have newline characters. tail -f will wait for a newline before printing anything out.
  2. The version of tail on Solaris (you didn't mention which Solaris release, but it probably doesn't matter) probably doesn't support that option. It may support tail -n 9999 -f. You may have to acquire the GNU version of tail (see the sketch after this list).
  3. Because the file is constantly growing, there is a race condition between finding out how big it is and starting the tail process. You could miss the start of the file if you don't ask for enough lines.
  4. tail won't know when you have really finished writing to the file, so your gzip process will never finish either. I'm not sure what will happen when you Ctrl-C the tail process, but it's likely that gzip will clean up after itself and remove the file it was working on.
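If a GNU tail is available on the Solaris machine (for example installed as gtail by a third-party package; the command name and availability are assumptions and vary by release), the follow-from-the-start form would look like:

# GNU tail: start at the first line and keep following as the file grows
gtail -n +1 -f /path/filename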

My suggestion would be to start your original program up and pipe its output straight through gunzip, like this:

./my_program | gunzip > new_file.txt

That way, gunzip will wait if my_program is going slow, but will still finish cleanly when my_program exits, which marks the true end of the data.

You may need to rewrite your program to write to STDOUT rather than directly to a file.
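If you also want to keep the compressed data on disk while decompressing it (the filenames here are placeholders), tee can write that copy in the middle of the pipeline:

# keep the compressed stream in file.gz and the decompressed output in new_file.txt
./my_program | tee file.gz | gunzip > new_file.txt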

Edit:

After a look at the man page, three of the issues above can be resolved. Using the -c <bytes> option instead of -n <lines> mitigates problem 1. Using -n +0 or -c +0 mitigates problem 3. Using --pid=<PID> will make tail terminate when the original program (running as <PID>) terminates, which mitigates problem 4.
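Putting those options together (GNU tail; the filename and the $WRITER_PID variable holding the writing process's PID are placeholders) gives something like:

# start at the first byte (-c +1, same effect as the -c +0 above), follow the file,
# and exit once the writing process goes away
tail -c +1 -f --pid="$WRITER_PID" /path/file.gz | gzip -d | tee result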

Ladadadada