I'm trying to copy files with s3cmd to a bucket with an existing folder structure. The structure already exists both on my local dev machine and in the bucket.
The folder structure is as follows:

dir2/
    00001/
    00002/
    00003/
    ...
The problem is that when I try to copy the files from my local machine to the bucket as follows:
s3cmd put --acl-public --recursive --verbose /home/user/dir1/dir2/ s3://my.bucket/assets/dir1/dir2/
I get the following output:
INFO: Compiling list of local files...
INFO: Applying --exclude/--include
INFO: Summary: 0 local files to upload
It looks like, since the folders already exist in the bucket, s3cmd skips copying the files from the local machine, despite the fact that the files themselves are not in the bucket (only the folders and other, differently named files are).
Any idea how to copy the files even when there's an existing sub-folder structure inside?
Could you just pipe `find` output to s3cmd? – Christopher – 2012-07-27T09:48:37.750
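A minimal sketch of what that suggestion might look like, assuming the paths from the question. It lists every file under the local tree and issues one `s3cmd put` per file, rebuilding the bucket key from the local path; the `echo` is left in as a dry run so you can inspect the commands before actually uploading.

```shell
#!/bin/sh
# Paths taken from the question; adjust to your setup.
SRC=/home/user/dir1/dir2
DEST=s3://my.bucket/assets/dir1/dir2

# One `s3cmd put` per regular file found under SRC.
find "$SRC" -type f | while IFS= read -r f; do
  # Strip the local prefix so the bucket layout mirrors the local one.
  key=${f#"$SRC"/}
  # Dry run: remove the `echo` once the generated commands look right.
  echo s3cmd put --acl-public "$f" "$DEST/$key"
done
```

This sidesteps s3cmd's own recursive file-list logic entirely, at the cost of one invocation (and one connection) per file.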