
I am trying out the CloudBerry Drive tool to attach S3 buckets as a network drive. I have a bucket with 2 folders inside it; assume the bucket name is environment and the 2 folders are dev and prod. I have 3 sets of users who will use this: Admin, Dev and Prod.

- Admin: must have R/W permissions to both folders.
- Dev: should have Write access only to the dev folder.
- Prod: should have Read access to the dev folder and Write access to the prod folder.

However, I am confused by the IAM permissions. Admin works fine with the S3 Full Access policy, but the other 2 accounts are not working.

Dev

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:DeleteObjectVersion",
        "s3:RestoreObject",
        "s3:PutObjectVersionTagging",
        "s3:PutObjectTagging",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::environment/dev/*"
    }
  ]
}

With this I should be able to read/write in the dev folder, but I am unable to. Using CloudBerry Drive I can copy-paste a file, but it does not get uploaded to S3. Also, the files in S3 are listed here (which is required) but I can't access them (error attached).
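As written, this policy grants only write-type object actions: there is no s3:GetObject for reads and no s3:ListBucket for directory listings. A minimal sketch of a dev policy that covers listing plus read/write, following the user-folder pattern AWS documents, and assuming the environment bucket and dev/ prefix from the question (untested):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListDevPrefixOnly",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::environment",
      "Condition": {
        "StringLike": {
          "s3:prefix": ["dev/*"]
        }
      }
    },
    {
      "Sid": "ReadWriteDevObjects",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::environment/dev/*"
    }
  ]
}

Depending on how the client lists the bucket root, the s3:prefix list may also need "" and "dev/", and drive-mapping tools often want s3:GetBucketLocation on the bucket ARN as well, as in the AWS example linked in the comments below.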

Prod

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:DeleteObjectTagging",
        "s3:DeleteObjectVersion",
        "s3:GetObjectVersionTagging",
        "s3:ReplicateTags",
        "s3:RestoreObject",
        "s3:PutObjectVersionTagging",
        "s3:DeleteObjectVersionTagging",
        "s3:ListMultipartUploadParts",
        "s3:ReplicateObject",
        "s3:GetObjectVersionTorrent",
        "s3:PutObject",
        "s3:GetObjectAcl",
        "s3:GetObject",
        "s3:ObjectOwnerOverrideToBucketOwner",
        "s3:GetObjectTorrent",
        "s3:AbortMultipartUpload",
        "s3:GetObjectVersionAcl",
        "s3:GetObjectTagging",
        "s3:PutObjectTagging",
        "s3:GetObjectVersionForReplication",
        "s3:DeleteObject",
        "s3:ReplicateDelete",
        "s3:GetObjectVersion"
      ],
      "Resource": "arn:aws:s3:::environment/Prod"
    }
  ]
}

With this I should be able to read/write in the prod folder, but I am unable to. Using CloudBerry Drive I can copy-paste a file, but it does not get uploaded to S3.
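One thing stands out here: the Resource is arn:aws:s3:::environment/Prod, which matches only a single object whose key is exactly Prod (object keys are case-sensitive, and the folder was introduced as prod). Object-level actions need the wildcard form. A corrected statement might look like this sketch, under the same assumptions as above:

{
  "Sid": "ReadWriteProdObjects",
  "Effect": "Allow",
  "Action": [
    "s3:GetObject",
    "s3:PutObject",
    "s3:DeleteObject"
  ],
  "Resource": "arn:aws:s3:::environment/prod/*"
}

A companion s3:ListBucket statement with an s3:prefix condition of prod/*, as in the dev sketch above, would also be needed for listings.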

For the prod user's read access to the dev folder, I also attached this policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:GetObjectVersionTorrent",
        "s3:GetObjectAcl",
        "s3:GetObject",
        "s3:GetObjectTorrent",
        "s3:GetObjectVersionTagging",
        "s3:GetObjectVersionAcl",
        "s3:GetObjectTagging",
        "s3:GetObjectVersionForReplication",
        "s3:GetObjectVersion",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": [
        "arn:aws:s3:::environment"
      ],
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "dev/*"
          ]
        }
      }
    }
  ]
}

The files in S3 are listed here (which is required) but I can't access them (error attached).
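The likely problem with this last policy: the s3:prefix condition key is only evaluated for list-type operations such as s3:ListBucket, and object-level actions like s3:GetObject must target object ARNs rather than the bucket ARN. Splitting it the same way as the earlier sketches gives a read-only policy for dev/ (same assumptions, untested):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListDevPrefix",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::environment",
      "Condition": {
        "StringLike": {
          "s3:prefix": ["dev/*"]
        }
      }
    },
    {
      "Sid": "ReadDevObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::environment/dev/*"
    }
  ]
}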

I hope my requirement is clear:

1. The dev user should have R/W access to the dev folder only.
2. The prod user should have R/W access to the prod folder and Read access to the dev folder.
3. All of this is done using CloudBerry Drive for S3.

Note: I have googled a lot and tried almost everything, including some policies that show how to provide write permission to a specific user folder.

serverstackqns
  • S3 doesn't actually have folders. The console simulates them, but it's just based off `/` characters in your objects' keys. Why not use separate buckets for dev/prod? Much safer and easier. – ceejayoz Feb 14 '18 at 14:50
  • @ceejayoz: The customer requires that for some purpose. I tried login to AWS console and then upload/delete the content from the folders which should have access, but then as well the same. So, something is wrong with the permission, if AWS allows this sort of things. – serverstackqns Feb 14 '18 at 14:52
  • I don't think the way you're writing the IAM policies is valid. See this from AWS: https://aws.amazon.com/blogs/security/writing-iam-policies-grant-access-to-user-specific-folders-in-an-amazon-s3-bucket/ - you probably need the `"Condition":{"StringLike":{"s3:prefix":["home/David/*"]}}` sort of bits. – ceejayoz Feb 14 '18 at 14:54
  • @serverstackqns could you please answer your own question, to help others in future and so people don't come in and try to help you with your unanswered question. – Tim Dec 07 '18 at 17:55

1 Answer


Apologies for writing a delayed answer. The fix was to change the condition in my policy like this:

"Condition":{"StringLike":{"s3:prefix":["home/David/*"]}}

Thanks to ceejayoz for pointing out this error.
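For future readers: the home/David/* prefix above is the example from the linked AWS post, and the condition belongs on an s3:ListBucket statement against the bucket ARN. Adapted to the dev folder from this question, that statement would look roughly like this sketch:

{
  "Sid": "AllowListingDevFolder",
  "Effect": "Allow",
  "Action": "s3:ListBucket",
  "Resource": "arn:aws:s3:::environment",
  "Condition": {
    "StringLike": {
      "s3:prefix": ["dev/*"]
    }
  }
}

Object reads and writes still need their own statement with a Resource of arn:aws:s3:::environment/dev/*, as the AWS post shows.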

serverstackqns