
Scenario: I have about a hundred-odd text files compressed as .gz. They need to be transferred to a remote server, and I do not want to use scp. Both machines are Linux servers.

My requirement/idea: I want to write and run a bash script on the source server for this. The plan is to zcat each file, pipe the output over ssh to user@remote (passwordless login is already configured), and redirect it into a text file on the remote server, something like the sketch below. Is this approach correct? If so, how do I implement it?
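A sketch of the per-file idea (the file names here are just placeholders):

zcat somefile.gz | ssh user@remote "cat > /path/somefile.txt"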

Thanks,

user492160

1 Answer


You can simply do it like this:

for file in dir/*.gz; do zcat "${file}" | ssh user@remote "cat > /path/$(basename "${file}")"; done
pkhamre
  • great. It works, but the files zcatted on the source server seem to be output to a single file on the remote server: each source file is about 500M on average, yet the output file on the remote server is growing past 2G. Why would that be? – user492160 Oct 09 '12 at 07:23
  • Because of this part: `ssh user@remote "cat > /path/${file}"`. The source is .gz, but on the remote host the data is written out by `cat`, uncompressed. – chocripple Oct 09 '12 at 07:29
  • Yeah, I think I would just hack it with some `tr -d` and add the `.gz` extension on the source files. – pkhamre Oct 09 '12 at 08:10
  • @pkhamre, please disregard my previous comment. Your original script works just fine, redirecting to multiple files, not a single file. Each gz file itself was about 3G when uncompressed, which I only found out later. Thanks. – user492160 Oct 09 '12 at 08:15
  • 1
    just add ssh user@remote "cat | gzip > /path/${file}" – chocripple Oct 09 '12 at 08:29
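
Putting the comment thread's fix together, a minimal sketch that keeps the files compressed on the remote end (dir/ and /path/ are placeholders carried over from the answer):

for file in dir/*.gz; do
  # decompress locally, recompress on the remote side so the .gz name stays accurate
  zcat "${file}" | ssh user@remote "gzip > /path/$(basename "${file}")"
done

Note that this decompresses the data only to recompress it again; streaming the raw bytes instead, with cat "${file}" | ssh user@remote "cat > /path/$(basename "${file}")", would move the same data without the extra CPU work.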