I am trying to copy all pictures and static files to a bucket of mine in Google Cloud Platform.
I am running this command from the root directory of my app:
find -regextype posix-extended -iregex ".*\.(js|css|png|jpg|gif|ttf|cur|woff|eot)" | gsutil -m cp -I gs://example-bucket/
And my files are in folders like this for example:
./pictures/bg/img.png
./pictures/pictures/dog.jpg
./fonts/modern.woff
The -I flag tells gsutil to read the list of files to copy from stdin, and the -m flag enables parallel (multi-threaded) uploads.
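For illustration, the same stdin mechanism with two of the paths below piped in by hand:

printf '%s\n' ./pictures/bg/img.png ./fonts/modern.woff | gsutil -m cp -I gs://example-bucket/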
This works, and I can see my files in the bucket. However, every file loses its original path and lands in the root of the bucket, like this:
gs://example-bucket/img.png
gs://example-bucket/dog.jpg
gs://example-bucket/modern.woff
The result I want is this:
gs://example-bucket/pictures/bg/img.png
gs://example-bucket/pictures/pictures/dog.jpg
gs://example-bucket/fonts/modern.woff
I would like the files to preserve their original paths.
I also tried the following and got the same result:
gsutil -m cp -r ./**/*.{js,css,png,jpg,gif,ttf,cur,woff,eot} gs://example-bucket/
The only thing that seems to work is a loop:

for i in ..get-files..; do
  gsutil cp "$i" "gs://example-bucket/$i"
done
And also this:

find ..find-expr.. -exec gsutil cp {} gs://example-bucket/{} \;
But both of those are too slow for my workflow.
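I considered parallelizing the per-file copies myself, along these lines (just a sketch, assuming GNU find and xargs; 8 parallel jobs is an arbitrary choice), but I would prefer a single gsutil invocation:

# Run up to 8 gsutil cp processes at once, one file each.
# Note: the leading "./" from find ends up in the object name unless stripped.
find . -regextype posix-extended -iregex ".*\.(js|css|png|jpg|gif|ttf|cur|woff|eot)" -print0 \
  | xargs -0 -P 8 -I{} gsutil cp {} "gs://example-bucket/{}"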
Thanks in advance for your help.