Copy specific files from an S3 bucket

I have some files in an S3 bucket.

aws s3 ls s3://bucketname/

file-111-100x100.jpg
file-112-1400x1400.jpg
file-123-250x250.jpg
file-231-1400x1400.jpg
file-222-700x700.jpg
file-333-100x100.jpg
file-131-1400x1400.jpg
file-321-250x250.jpg
file-232-480x480.jpg
file-113-1400x1400.jpg
file-331-100x100.jpg

How can I copy, using the AWS command line, only the files whose names contain 1400x1400, like

file-112-1400x1400.jpg
file-231-1400x1400.jpg
file-131-1400x1400.jpg
file-113-1400x1400.jpg

Help me to do this.

Thank you.

Jaimin

Posted 2018-04-18T11:49:21.540

Reputation: 53

Answers

I'm not sure whether the AWS CLI has a built-in way to do this, but I've always just used a simple bash loop for things like this. For example:

for i in $(aws s3 ls s3://bucketname/ | awk '{print $4}' | grep 1400x1400); do aws s3 cp "s3://bucketname/$i" .; done

It's not the prettiest way to do it, but it's pretty general and flexible.
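
If you want to check the match first, you could preview which keys the filter picks up before copying anything (a small sketch assuming the same bucketname placeholder as above; the fourth column of aws s3 ls output is the object key):

aws s3 ls s3://bucketname/ | awk '{print $4}' | grep 1400x1400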

Michael Pobega

Posted 2018-04-18T11:49:21.540

Reputation: 131

I believe the best way is to use the AWS CLI --include and --exclude filters:

aws s3 cp s3://bucketname/ [dest] --recursive --exclude "*" --include "*1400x1400*"

where [dest] is your destination (e.g. an S3 path or a local path).

Two points:

  • If you only want to include certain files, you need to --exclude "*" first and then --include the pattern you want; the order of the filters matters.
  • The --recursive flag is needed whenever the source is a bucket prefix rather than a single object, so that every matching file under it is copied (see the example after this list).
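
Putting it together, a minimal sketch (assuming the same bucketname placeholder as the question and the current directory as the destination); the --dryrun flag prints what would be copied without transferring anything:

aws s3 cp s3://bucketname/ . --recursive --exclude "*" --include "*1400x1400*" --dryrun

Drop --dryrun once the listed files look right.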

Reference on filters: https://docs.aws.amazon.com/cli/latest/reference/s3/index.html#use-of-exclude-and-include-filters
More info on aws s3 cp: https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html

Lood van Niekerk

Posted 2018-04-18T11:49:21.540

Reputation: 1