
I'm familiar with how I would tackle this with PHP, but I'd like to get some more practice with bash scripting.

The task is to delete all files in a folder which itself contains subfolders (with files). The files would typically be .pdf (or some case variant: PDF, Pdf, pDf, etc.), but there may occasionally be other file types, including extensions unknown to me at this time.

Here's what I have so far. It echoes each filename, but when I issue rm $i, the system returns "file not found" for each file.

for i in `ls -bRC1 /foo/temp_folders/* ` ; do echo $i ; rm $i ; done

How would I force the absolute path when issuing rm $i?
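
(I suspect the loop fails because ls -R prints each subdirectory's contents as bare filenames without their directory prefix, so rm never sees a usable path.) Here's a pure-bash sketch of the loop I'm aiming for, assuming bash 4+ for globstar; untested:

shopt -s globstar nullglob            # ** recurses into subfolders; unmatched globs expand to nothing
for i in /foo/temp_folders/**/*; do
    [ -f "$i" ] && rm -- "$i"         # quote "$i" so paths with spaces survive
done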

a coder

2 Answers


Per Zoredache... why not:
find /foo/temp_folders/ -type f -iname '*' -exec rm {} +

(Quote the pattern so the shell doesn't expand the * before find sees it.)

Edit: changed the trailing \; to + for performance as noted here
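
If you ever need to restrict this to the PDF variants from the question, -iname matches case-insensitively (a sketch, using the same placeholder path):

find /foo/temp_folders/ -type f -iname '*.pdf' -exec rm {} +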

gharper
  • I'd uprate you, but I don't have the mojo just yet - thanks for the response. – a coder May 09 '12 at 20:16
  • I wouldn't go with -exec rm {} + but rather simply with -delete: it's much faster than constantly calling exec. The other day I had to delete tons of files and tried both; the -delete option came out the absolute winner (see the sketch after these comments). – milosgajdos May 09 '12 at 20:50
  • Nifty, I didn't realize find even had a -delete option... I'll have to remember that one. – gharper May 10 '12 at 16:35
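
For reference, the -delete form mentioned in the comments looks like this (GNU find; same placeholder path as above):

find /foo/temp_folders/ -type f -delete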

find -exec would be slow on a large number of files. I would suggest:

find /foo/temp_folders -type f -print0 | xargs -0 rm
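
To preview what would be removed before committing, you can substitute echo for rm; the command then just prints the rm invocations it would run:

find /foo/temp_folders -type f -print0 | xargs -0 echo rm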
Bittrance
  • I'd uprate you, but I don't have the mojo just yet - thanks for the response. – a coder May 09 '12 at 20:12
  • Actually, this is a good point, and not one I usually have to consider. I found an interesting article that compares "-exec {} \;" to "-exec {} +" to "| xargs" here: http://fahdshariff.blogspot.com/2009/05/find-exec-vs-xargs.html – gharper May 09 '12 at 20:45