Fastest way to transfer a lot of small files to a server (without git)

I often need to deploy a large number of files to a server. The files themselves are small in terms of storage space, but transferring them over SFTP or FTP makes the process really slow.

Is there a faster way (a faster protocol) to transfer thousands of small files (1-30 KB each)?

I am currently using compress->transfer->uncompress, but that's an overhead I'd like to avoid.
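
For reference, the current workflow looks roughly like this (the archive name and paths are placeholders, and the exact tar/scp invocations are only an assumption about how the compress/transfer/uncompress steps are done):

$ tar czf files.tar.gz -C path/to/local/files .                      # compress
$ scp files.tar.gz server:/tmp/                                      # transfer
$ ssh server 'tar xzf /tmp/files.tar.gz -C path/to/remote/files'     # uncompress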

I have shell access to the server, but only a limited set of commands is available.

brett

Answers

rsync is pretty efficient with lots of small files:

$ rsync -a path/to/local/files/ server:path/to/remote/files/
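
If the network link rather than the per-file overhead is the bottleneck, rsync can also compress the data in transit with -z (this assumes rsync is installed on the server as well, since it runs over SSH by default):

$ rsync -az path/to/local/files/ server:path/to/remote/files/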

Paul R

It's faster than FTP indeed, thanks. – brett – 2019-06-18T11:12:57.483

cd path/to/local/files/ - the local directory you want to copy the files into.

sftp server:path/to/remote/files/ - the remote directory you want to copy the files from.

Then copy the files with get -r file_name. If that gives a get: Invalid flag -r error, use -R instead of -r.
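
Put together, the session this answer describes would look roughly like this (the paths and file_name are placeholders taken from the answer itself):

$ cd path/to/local/files/               # local destination directory
$ sftp server:path/to/remote/files/     # start the session in the remote source directory
sftp> get -r file_name                  # use -R if -r is reported as an invalid flag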

user883057

Welcome to Super User. The question asks for the fastest way to transfer a lot of files. Your solution uses file_name as a parameter, which implies running the command manually for each file, perhaps twice because of the invalid flag error. That doesn't seem like it could be the fastest solution. – fixer1234 – 2018-03-16T07:05:35.993