Use all CPUs when creating archives on Linux


Today I wanted to create a backup of a folder on a server with 16 CPUs. I started looking for an option that can utilize my hardware better than plain tar does, something with multi-thread support, probably. I did some research and found tools like pbzip2 and pigz, but they only compress, they don't archive. So do you guys have an elegant solution for that?

Navid

Posted 2012-07-19T05:31:45.973


Answers


Archiving itself is I/O-intensive, and won't benefit from multiple cores. Feed the uncompressed output of tar to one of the programs you found.

Edit:

tar -cO /directory/path | whizbang -compress --ultra-brute --cpus=16

where whizbang is replaced with your favorite compressor, depending on your speed vs. size preferences.

Ignacio Vazquez-Abrams



What Ignacio said... but -O (--to-stdout) is for extracting files to stdout, not for writing the archive there...
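To see what -O actually does with GNU tar, here's a quick illustration (the paths and file names are just examples): during extraction, -O sends member contents to stdout instead of writing files to disk.

```shell
# Create a tiny archive, then extract one member to stdout with -O.
# /tmp/demo and a.txt are hypothetical example names.
mkdir -p /tmp/demo
echo "hello" > /tmp/demo/a.txt
tar cf /tmp/demo.tar -C /tmp/demo a.txt

# -O (--to-stdout) prints the member's contents instead of creating a file:
tar -xOf /tmp/demo.tar a.txt      # prints "hello"
```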

So, my suggestion:

tar cf - /directory/path | whizbang -compress --cpus=16 > archive.tar.whizbang

The next step, of course, would be badgering the maintainers of tar to include whizbang support in their next release. ;-)
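For a concrete version of that pipeline with a real parallel compressor, pigz (parallel gzip) is one drop-in choice; this is a sketch assuming pigz is installed, with -p setting the worker-thread count and the path being an example:

```shell
# tar streams the uncompressed archive to stdout; pigz compresses it
# on 16 threads. /directory/path and the thread count are examples.
tar cf - /directory/path | pigz -p 16 > archive.tar.gz

# pigz emits standard gzip, so extraction works with plain gzip + tar:
pigz -dc archive.tar.gz | tar xf -
```

pbzip2 works the same way (tar cf - /directory/path | pbzip2 -c > archive.tar.bz2), trading compression speed for a smaller archive.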

DevSolar
