As in previous answers (+1 for both), the trick is to use the -type f predicate.
Note that instead of -exec rm '{}' you can also use the -delete predicate. But don't do that right away. With -exec rm '{}' you can (and should) first run -exec echo rm '{}' to verify that this is really what you want. After that, rerun the command without the echo.
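A sketch of that dry-run-then-delete workflow (the tmpdemo directory and the file names are made up for illustration):

```shell
#!/bin/sh
# Set up a throwaway directory with some files (illustration only)
mkdir -p tmpdemo
touch tmpdemo/a.tmp tmpdemo/b.tmp tmpdemo/keep.txt

# Dry run: prints an "rm tmpdemo/..." line for each match
# instead of deleting anything
find tmpdemo -type f -name '*.tmp' -exec echo rm '{}' \;

# Once the output looks right, drop the echo and delete for real
find tmpdemo -type f -name '*.tmp' -exec rm '{}' \;
```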
Using -delete is faster (no extra fork() and execve() for each file), but it is risky, because -delete also acts as a condition, so:
# delete *.tmp files
find . -type f -name '*.tmp' -delete
but if you merely swap the arguments:
# delete ALL files (-delete matches before the name test is even reached)
find . -delete -type f -name '*.tmp'
If you ever need find and rm to work faster on tons of files, check out the find ... | xargs ... rm UNIX idiom.
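A minimal sketch of that idiom (bigdir and the file names are illustrative; -print0 and xargs -0 are GNU/BSD extensions that keep file names with spaces or newlines intact):

```shell
#!/bin/sh
# Throwaway directory for illustration
mkdir -p bigdir
touch bigdir/one.tmp "bigdir/two with spaces.tmp" bigdir/keep.txt

# NUL-separated names survive spaces and newlines; xargs batches
# them into as few rm invocations as possible, avoiding one
# fork()/execve() per file.
find bigdir -type f -name '*.tmp' -print0 | xargs -0 rm --
```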
-mtime doesn't seem to work on Windows with Git Bash, but works fine on Unix, so that's OK :) – GabLeRoux – 2014-10-17T13:30:37.013

I need to delete all files and subfolders inside a directory, from cron. I use this command, but it also deletes the top directory. E.g. I want to remove everything inside /var/download/, but when I run the command it removes the "download" directory too. How do I avoid this? – Kreker – 2015-06-05T09:25:30.187
@Kreker That shouldn't happen. If you use -type f it will only delete files, not directories. find /var/download -type f -mtime +10 should be enough. – slhck – 2015-06-05T09:29:41.710

@slhck In the "download" dir there are directories too. I need to empty the whole "download" directory. I used the command without -type, but the first result is the directory itself. – Kreker – 2015-06-05T12:17:33.203
@Kreker Then you need to add -mindepth 1 to exclude the parent directory. – slhck – 2015-06-05T12:22:47.420

@slhck Yeah! Works like a charm, thank you very much! +1 +1 – Kreker – 2015-06-05T12:38:53.577
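A sketch of emptying a directory while keeping the directory itself (downloads and the file names are made up; -mindepth and -delete are GNU/BSD extensions):

```shell
#!/bin/sh
# Throwaway tree for illustration
mkdir -p downloads/sub
touch downloads/top.dat downloads/sub/nested.dat

# -mindepth 1 skips the starting point itself, so "downloads"
# survives. -delete implies -depth, so contents are removed
# before their containing directory, emptying subdirectories too.
find downloads -mindepth 1 -delete
```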
Any idea why this wouldn't work inside a shell script? I always get an "unknown predicate '-delete'" error. – scsimon – 2017-05-03T14:26:44.423
@scsimon Make sure you are using GNU find, and that your shell script is not picking up a stripped-down version of find that does not support it. – slhck – 2017-05-03T14:43:58.567

Well, there are a lot of folders within /path/to/files/. Actually I would like to run the mtime +10 -exec rm on each of those folders but keep the folders themselves. Basically, what I would like to achieve is to delete all the files older than 10 days and keep all the folders. I'm a newb, sorry about that newbness :) – JoyIan Yee-Hernandez – 2012-01-05T16:26:45.593
Well, there you go :) Just do a find /path/to/files* -type f -mtime +10 and see what it would output. – slhck – 2012-01-05T16:28:22.523

Yup, when in doubt, don't do the -exec, just see what it finds first. – Rob – 2012-01-05T17:19:53.727

@JoyIanYee-Hernandez You can also use the -delete argument to find, which deletes all files and folders, the latter only if empty. It implies -depth, i.e. a depth-first search. – Daniel Beck – 2012-01-05T17:40:31.837

This gives an "argument list too long" error if there are too many files. Could someone provide an iterative version? – cen – 2014-05-20T23:37:33.423

@cen This only happens if the path you give to find is expanded, which is only the case when there is a glob in it (e.g. *). Make sure you supply only a real directory as an argument. I'll fix my answer. – slhck – 2014-05-21T05:11:12.573