How can I speed up a selective deep copy command in Unix?

I am (ab?)using the Unix 'find' command to recursively export files from a directory tree that is littered with .svn directories. The command takes a long time to run. Is there a faster way to accomplish the same thing?

find source/ -type f -and ! -path '*.svn*' -and -exec cp {} export \;

What I want to do is search every subdirectory of the tree rooted at source/ and copy any files found there into the export directory, skipping any files that live inside the .svn directories.


Update (complete solution, based on the answer posted by Michał Šrajer):

find source/ -type f -and ! -path '*.svn*' -and -print0 | xargs -0 \
    cp --target-directory=export

For those who are curious: xargs on Wikipedia.
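
As a side note, a similar batching effect can be had without xargs at all. The sketch below assumes GNU find and GNU coreutils cp, which provide the -exec ... {} + terminator and the -t (--target-directory) option:

# Rough equivalent of the pipeline above: find itself batches the file
# names and passes each batch to a single cp invocation.
find source/ -type f ! -path '*.svn*' -exec cp -t export {} +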

e.James

Posted 2011-10-12T19:09:00.460

Reputation: 788

Answers

If you cannot use rsync as jhcaiced answered (+1), you can do it in several ways:

  1. Copy the directory with one cp -r and then remove all .svn dirs from the copy, e.g. with find export -type d -name .svn -prune -exec rm -rf {} + (find's -delete only removes empty directories, so it is not enough here).
  2. Pipe one tar c into another tar x; tar has an --exclude= option (a sketch follows this list).
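
A minimal sketch of option 2, assuming GNU tar for the --exclude option:

mkdir -p export
# Pack source/ while skipping .svn directories, then unpack into export/.
tar -C source -cf - --exclude='.svn' . | tar -C export -xf -

Note that, like cp -r, this keeps the directory layout under source/, whereas the find command in the question flattens all files directly into export.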

You can also optimize your original command. Note that it calls cp once for every single file, which takes a long time. If you use -print0 together with xargs -0 instead, xargs passes many file names to each cp invocation, so the process start-up cost is paid once per batch rather than once per file. This will be much faster.

Michał Šrajer

Posted 2011-10-12T19:09:00.460

Reputation: 2 495

Your suggestion to use -print0 and xargs -0 worked beautifully. Thank you! – e.James – 2011-10-12T20:57:14.550

@e.James: you're welcome – Michał Šrajer – 2011-10-12T21:02:43.933

You can avoid copying the same files over and over by using rsync; the command would be something like:

rsync -a source/ export/ --exclude '.svn'

If you also need to remove from export/ any files that have been removed from source/, add the --delete parameter to the command.
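
A sketch of the combined command, under the assumption that the goal is an exact mirror of source/ minus the .svn directories:

# Mirror source/ into export/, skipping .svn directories and deleting
# anything from export/ that no longer exists in source/.
rsync -a --delete --exclude '.svn' source/ export/

The trailing slash on source/ matters: it tells rsync to copy the contents of source/ rather than the source directory itself.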

jhcaiced

Posted 2011-10-12T19:09:00.460

Reputation: 1 488

+1, but will that preserve the directory structure? I'm trying not to preserve that structure, i.e. export should contain no subdirectories, just a big list of files – e.James – 2011-10-12T19:57:39.413