Delete all files older than X days

So I have a cron job set up that backs up a folder into a tarball every hour. I would like to add to the shell script I'm using the ability to delete files automagically after about three days, so that I don't end up with a crap ton of files.

How can I go about this? Thanks.
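For reference, a minimal sketch of the kind of hourly setup being described; the schedule, script path, and folder paths are assumptions, not details from the question. The crontab entry runs a small script at the top of every hour:

0 * * * * /usr/local/bin/backup.sh

and the script itself just writes a timestamped tarball of the source folder:

#!/bin/sh
# /usr/local/bin/backup.sh (hypothetical) - archive the source folder into a timestamped tarball
tar -czf /path/to/backup_folder/backup-$(date +%Y%m%d-%H%M).tar.gz /path/to/source_folder

The answers below add the age-based cleanup to a setup like this.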

Chiggins

Posted 2011-01-21T17:09:43.430

Reputation: 379

Answers

Add this line to the script (modify accordingly):

find /path/to/backup_folder -mtime +3 -exec rm {} \;

This assumes that your backup tarballs, and only your backup tarballs, reside in that folder. You could also use the tmpwatch utility, which takes the age in hours (so 72 is three days):

tmpwatch -mf 72 /path/to/backup_folder
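If anything other than the backups could ever end up in that folder, a more defensive variant of the find command (assuming here that the tarballs are named *.tar.gz, which the question does not say) restricts the match to regular files with that name pattern in the top level of the folder:

# Only remove regular files matching the tarball naming pattern, older than 3 days
find /path/to/backup_folder -maxdepth 1 -type f -name '*.tar.gz' -mtime +3 -exec rm {} \;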

John T

Posted 2011-01-21T17:09:43.430

Reputation: 149 037

So if I just add this in (changing the folder path), it'll delete anything older than three days? – Chiggins – 2011-01-21T17:15:03.627

Yes, but keep in mind it is recursive. If you have any folders beneath the backup folder, add the -maxdepth 1 switch. – John T – 2011-01-21T17:17:07.860

If you are at all unsure, add "echo " before the rm and verify the output first. – Chris Nava – 2011-01-21T17:20:39.127

@Chris good suggestion! – John T – 2011-01-21T17:25:44.117
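Putting the two comments together, a dry run that stays in the top level of the folder and only prints what would be removed looks like this (the path is a placeholder); drop the echo once the output looks right:

# Print the rm commands instead of running them
find /path/to/backup_folder -maxdepth 1 -mtime +3 -exec echo rm {} \;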

From my crontab on my Mac:

0 13 * * * /usr/bin/find /Users/dharris/.Trash -atime +14 -mindepth 1 -maxdepth 1 -print0 | xargs -0 ls -ltd

Using -atime rather than -mtime means that if I access a file, it won't be deleted. Note that as written the pipeline only lists the matching files (ls -ltd); replace that with rm once you are happy with what it finds.

My version here uses +14 for a two-week delay; change it to +3 for your needs.
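Adapted to the question's backup folder (the path is a placeholder) and a three-day window, with rm in place of the ls preview, the crontab entry would look roughly like this:

# At 13:00 daily, delete top-level backups not accessed in the last 3 days
0 13 * * * /usr/bin/find /path/to/backup_folder -atime +3 -mindepth 1 -maxdepth 1 -print0 | xargs -0 rm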

Doug Harris

Posted 2011-01-21T17:09:43.430

Reputation: 23 578

+1 for the atime use... I always do this too. If I looked at the file, it may be interesting enough to keep around a little longer. – Rich Homolka – 2011-01-21T17:33:18.583