How can I copy multiple files over scp in one command?

20

4

I have to transport a lot of files from one PC to another (both Linux). I would like to use scp for that, but scp only allows for transferring one file at a time.

How can I do this?

I have

  • No possibility to use rsync or any other protocol
  • No possibility to use passphrase-free certificates (but I have a certificate with a passphrase)
  • A list of files to transfer and a list with the destination path of the files on the other server
  • The files are spread out over a lot of directories, and I don't want to copy all of the files in those directories

If possible, I would like to gzip and ungzip transparently to save bandwidth!

Is this possible?

Peter Smit

Posted 2010-03-04T07:48:38.520

Reputation: 7 906

Answers

31

Use tar:

tar cvzf - -T list_of_filenames | ssh hostname tar xzf -
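
For example, a sketch of how this might run end to end (the list file name and destination directory are illustrative): the list holds one path per line, and adding -C to the remote tar unpacks into a chosen directory:

# list_of_filenames holds one path per line, e.g.:
#   docs/report.txt
#   src/main.c
# -T reads the names to archive from the list; -C on the remote side picks the unpack directory
tar cvzf - -T list_of_filenames | ssh hostname "tar xzf - -C /destination/dir"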

akira

Posted 2010-03-04T07:48:38.520

Reputation: 52 754

Here's the file format: "In c mode, tar will read names to be archived from filename. The special name '-C' on a line by itself will cause the current directory to be changed to the directory specified on the following line. Names are terminated by newlines unless --null is specified. Note that --null also disables the special handling of lines containing '-C'." – Kousha – 2010-03-04T10:46:16.763

15

I would like to use scp for that, but scp only allows for transferring one file at a time.

I'm pretty sure that isn't true, at least not for the scp command provided by the OpenSSH included with most Linux distributions.

I use scp file1 file2 ... fileN user@host:/destination/directory/ fairly frequently.

For transparent compression, the SSH protocol has this built in, and scp can use it if you provide the -C option on the command line.

For lots of similar small files, you will find the tar+gz option suggested by akira gains better compression, as it can make use of the similarity between such files, whereas scp compresses each file as a separate entity. I generally prefer to use scp, though, as it is easier to resume a partial transfer, or rsync (though I know the questioner doesn't have that option in this situation), as it is both even easier to resume and shares the tar+gz option's whole-stream compression advantage.
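
A minimal sketch of both approaches, with host and paths as placeholders:

# scp with SSH's built-in compression; each file is compressed separately
scp -C file1 file2 fileN user@host:/destination/directory/

# tar+gz pipe; the whole stream is compressed as one unit
tar czf - file1 file2 fileN | ssh user@host "tar xzf - -C /destination/directory/"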

David Spillett

Posted 2010-03-04T07:48:38.520

Reputation: 22 424

This is the right answer, but it really only transfers one file at a time. Nevertheless it will make full use of available bandwidth and should run as fast as possible. – Benjamin Bannier – 2011-04-28T19:01:29.593

4

Doesn't scp -r yourdir otherhost:/otherdir work?

Try this then:

tar cfz - . | ssh otherhost "cd /mydir; tar xvzf -"

The z flag to tar does the compression. Alternatively, you can use the -C flag to ssh:

tar cf - . | ssh -C otherhost "cd /mydir; tar xvf -"

Jimmy Hedman

Posted 2010-03-04T07:48:38.520

Reputation: 886

what does -C do? – Nathan Fellman – 2010-03-04T08:53:56.300

The problem is, I have a list of files. The files are spread out over a lot of directories, and I don't want to copy all the files in those directories – Peter Smit – 2010-03-04T09:34:22.003

@Nathan: -C is to let ssh do the compression. – Jimmy Hedman – 2010-03-04T11:32:59.740

1

On RHEL 5, this is the only way I know of to securely copy a list of files whose names contain special characters (like spaces). Make a shell script containing the following:

FILE="/path/filename"
while read line; do
scp "$line" username@servername:/destination/
done < $FILE
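
Note that each scp invocation in this loop opens a new SSH connection, so a key with a passphrase will prompt on every iteration. A sketch of an OpenSSH client config entry (the host alias is a placeholder) that multiplexes all the copies over one authenticated master connection:

# ~/.ssh/config — reuse a single connection for the repeated scp calls
Host servername
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h-%p
    ControlPersist 10m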

Brian Thomas

Posted 2010-03-04T07:48:38.520

Reputation: 11

0

It can be done w/o much trouble.

I simply create a list of files with the hostname prefixed to each file, and scp seems to parse it fine, i.e.:

srchost=`hostname`
myfiles="${srchost}:~/.bashrc ${srchost}:~/.bash_profile ${srchost}:~/.vimrc ${srchost}:~/.viminfo ${srchost}:~/.toprc ${srchost}:~/.dir_colors"
scp -v $myfiles $HOME

scp seems to work fine with this syntax. When I attempted it without the hostname next to each file, it failed after the first file. I seem to recall another syntax I used for this before and will post it once I locate it.

Ernie M.

Posted 2010-03-04T07:48:38.520

Reputation: 1

This works, but is quite slow - it seems to start a connection for each file... (I came here looking for a solution for the slow copy...)

~/ can mostly be left out - SCP operates relative to the home directory...

If you have a list from elsewhere in $list: scp $(echo "$list"|sed "s/^/$srchost:/") local_destination (This will break for filenames with spaces though) – Gert van den Berg – 2016-10-06T08:50:48.127
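
Building on that caveat: a sketch that reads the list into a bash array first, so local filenames containing spaces survive the quoting (the list file and destination are placeholders; a remote source path would still be word-split by the remote shell):

# read one filename per line into an array, preserving spaces
files=()
while IFS= read -r f; do files+=("$f"); done < list_of_filenames
scp "${files[@]}" username@servername:/destination/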

-1

I would recommend two parts: first, use a command like this to create an archive from your file list.

tar -cvf allfiles.tar -T mylist.txt

Then use scp to download the archive.

scp user@remotehost:~/allfiles.tar .

https://stackoverflow.com/questions/8033857/tar-archiving-that-takes-input-from-a-list-of-files
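
Put together, a sketch of the two steps when the files live on the remote host (file and host names are placeholders, and mylist.txt is assumed to already exist there):

# build the archive on the remote side from the list stored there
ssh user@remotehost 'tar -cvf allfiles.tar -T mylist.txt'
# fetch the archive, then unpack it locally
scp user@remotehost:~/allfiles.tar .
tar -xvf allfiles.tar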

Justin Kubicek

Posted 2010-03-04T07:48:38.520

Reputation: 1