I have lots of ~1 GB files (database dumps, taken at regular intervals). Right now I'm storing them all in one directory, each file gzipped. We're running out of disk space but want to keep storing the old dumps. Ignoring the obvious solution of throwing money at the problem and buying more disks, is there any way to store these in a more space-efficient manner?
Since the dumps are taken every half hour, consecutive files should share a lot of duplicate content. Is there a program or process that can take advantage of that? I'd rather not switch to a new filesystem. I've been playing around with git and git-repack, but that uses a lot of memory. Is there anything a bit simpler?
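For reference, here's roughly what I've been trying with git (file names and repack settings are made up, just to show the idea of committing each uncompressed dump so git can delta-compress successive versions):

    # one-off setup
    mkdir dump-repo && cd dump-repo && git init

    # commit each dump, uncompressed, as a new version of the same file
    for f in /backups/*.sql.gz; do
        gunzip -c "$f" > dump.sql
        git add dump.sql
        git commit -q -m "dump $(basename "$f")"
    done

    # repack aggressively so similar dumps share deltas;
    # this is the step that eats a lot of memory
    git repack -a -d -f --window=250 --depth=250

It does shrink things nicely, but the repack step is the memory hog I mentioned.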