For directories I'd use tar piped to bzip2 with maximum compression.
A simple way to go is:
tar cfj archive.tar.bz2 dir-to-be-archived/
This works great if you don't intend to fetch small sets of files out of the archive
and are just planning to extract the whole thing whenever/wherever required.
That said, if you do want to pull a small set of files out, it's not too bad.
I prefer to call such archives filename.tar.bz2 and extract with the 'xfj' option.
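For example, extracting just one file or subdirectory means naming its stored path on the command line (the paths below are hypothetical; check the real ones with 'tar tjf' first):

```shell
# Extract only the named file from the archive, leaving the rest packed.
tar xjf archive.tar.bz2 dir-to-be-archived/notes.txt

# The same works for a whole subdirectory.
tar xjf archive.tar.bz2 dir-to-be-archived/subdir/
```

Note that tar still has to read through the compressed stream to find those entries, so this is convenient rather than fast.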
The max-compression pipe looks like this:
tar cf - dir-to-be-archived/ | bzip2 -9 - > archive.tar.bz2
# ^ pipe tarball from here to bzip2 ^ into the archive file.
Note: the 'bzip2' method with higher compression tends to be slower than regular gzip from 'tar cfz'.
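A quick way to see that trade-off on your own data is to time both pipelines and compare the resulting sizes (using bash's `time` keyword, which times the whole pipeline; results will of course vary with the input):

```shell
# Time the gzip pipeline versus the bzip2 -9 pipeline on the same tree.
time tar cf - dir-to-be-archived/ | gzip     > archive.tar.gz
time tar cf - dir-to-be-archived/ | bzip2 -9 > archive.tar.bz2

# Compare the archive sizes.
ls -l archive.tar.gz archive.tar.bz2
```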
If you have a fast network and the archive is going to be placed on a different machine,
you can speed up with a pipe across the network (effectively using two machines together).
tar cf - dir/ | ssh user@server "bzip2 -9 - > /target-path/archive.tar.bz2"
# ^ pipe tarball over the network ^ compress and store on the remote machine.
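Either way, you can list the directory tree stored in the archive without extracting it, locally or over the same ssh connection (host and path here are placeholders, as above):

```shell
# List the contents of the archive without unpacking anything.
tar tjf archive.tar.bz2

# Same check against the copy on the remote machine.
ssh user@server "tar tjf /target-path/archive.tar.bz2"
```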
Some references:
- Linux Journal: Compression Tools Compared, Jul 28, 2005
- gzip vs. bzip2, Aug 26, 2003
- A Quick Benchmark: Gzip vs. Bzip2 vs. LZMA, May 31, 2005
@DanielBeck, Problem with tar is that they don't show the directory tree. So to even get a "view", we need to unzip that whole tar. Are there alternatives to tar that shows directory view? – Pacerier – 2015-05-16T22:33:09.920
You probably cannot beat tar, as it doesn't actually compress, only archive, without specific options that enable that. In answers, I'd love to see proof, no opinion... – Daniel Beck – 2011-06-19T05:28:51.487
Depends how much compression you want. – ta.speot.is – 2011-06-19T06:49:15.807
I did end up using tar and for speed reasons did not try compressing it yet. It was able to complete in time for what I needed it for. Thanks! – Spike – 2011-06-20T03:16:21.770