
I have a Linode (Ubuntu 16.04) from which I am trying to copy files to an AWS S3 bucket via a cron job calling a script, but the log outputs: upload failed: ../path/file.ext to s3://bucket/prefix/file.ext Unable to locate credentials

The script tars a directory and then uploads that tar to my S3 bucket (sketched below).
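
For context, a minimal sketch of what the script does (the directory, bucket, and prefix here are placeholders, not my real values):

#!/bin/bash
# archive the directory, then push the archive to S3
tar -czf /tmp/backup.tar.gz /path/to/dir
/usr/local/bin/aws s3 cp /tmp/backup.tar.gz s3://bucket/prefix/backup.tar.gz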

  • The script works start to finish if I call it directly via sudo
  • As a root cron job, the tar works, but the aws upload doesn't (with error noted above)
  • As a [user] cron job, the tar fails (intentionally, due to permissions), but the aws upload succeeds.
  • When I installed the AWS CLI, I forget exactly how the option was worded, but I chose to install it for all users

Things I've Tried

  1. Having my script call aws directly at /usr/local/bin/aws
  2. Adding /usr/local/bin/aws to the PATH in crontab, and also in my script
  3. Adding AWS_CONFIG_FILE="/home/[user]/.aws/config" in crontab, and also in my script
  4. Re-running aws configure as root
  5. Following this tip and comparing the cron and interactive environments (see the sketch after this list). My env.cron PATH includes everything listed in my env.interactive PATH, plus a few more entries, even some duplicates - is that bad?
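
A rough sketch of how I captured the two environments for comparison (file locations here are placeholders):

# temporary crontab entry (via crontab -e), removed after one run
* * * * * env > /tmp/env.cron 2>&1

# then, from a normal interactive shell
env > /tmp/env.interactive
diff /tmp/env.interactive /tmp/env.cron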

There are many more entries in my env.interactive (1810 lines) than in my env.cron (36 lines). It must be something in those environment differences, right? I've searched my env.interactive for any instance of aws, but there is none, even though that environment works just fine. Any tips on other specific items to look for in there?

Any ideas and help are appreciated! Thanks!

– apex

3 Answers


If you want to run a specific command as another user with sudo, and have the config read from that user's home directory rather than yours, then you have to run it with sudo -H -u user ..., which makes sudo update the HOME variable to the called user's home automatically.

With HOME updated, valid values for AWS_CONFIG_FILE and the credentials path are implied automatically: the CLI reads that user's ~/.aws/config and ~/.aws/credentials.
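
For example, a root crontab entry along these lines (schedule and script path are placeholders):

# run the backup as [user]; -H makes sudo set HOME=/home/[user]
0 2 * * * sudo -H -u [user] /home/[user]/backup.sh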

– Josip Rodin

I copied the .aws directory from /home/[user]/ to /root/, and that seemed to solve it.
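
Concretely, the copy amounted to something like this, run as root:

# copy the whole .aws directory (config and credentials) into root's home
cp -r /home/[user]/.aws /root/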

Are there any security/other concerns surrounding the config files being duplicated to the root folder?

– apex
  • Regarding *#4* of **Things I've Tried** - I had tried `sudo aws configure`, but I hadn't actually run the command as root. I found the command `aws configure list`, which shows the current config: running it as [user], or via sudo, showed the proper values, but switching to the root user showed it as not configured. Apparently `aws configure` needs to be run as each user that will execute the CLI, and re-run whenever the credentials need updating? – apex Jan 29 '17 at 01:29
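
For reference, a quick sketch of checking which config each invocation actually sees (on this Ubuntu 16.04 setup, plain sudo keeps the caller's HOME, while sudo -H switches it to root's):

aws configure list          # as [user]: reads /home/[user]/.aws
sudo aws configure list     # root, but HOME still points at /home/[user]
sudo -H aws configure list  # root with HOME=/root: reads /root/.aws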

The problem here is that the bash script needs credentials set up in order to upload files to S3. I faced the same issue.

When I ran the script directly it worked, but as soon as I ran the same script from cron, exactly this credentials error appeared.

I solved it by exporting the credentials inside the bash script itself, because under cron the script could not pick up the credentials from aws configure, so re-running aws configure alone would not fix the issue. The following lines show how I did it, and it worked for me.

export AWS_CONFIG_FILE="/root/.aws/config" #change it as per user
export AWS_ACCESS_KEY_ID=XXXXXXXXXXXXXXX
export AWS_SECRET_ACCESS_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Refer to this link for more information.

But putting AWS credentials in a script is not recommended. Instead, use whichever user you configured the AWS credentials under. Log in as that user:

sudo su - {user}

Then run aws configure and enter your AWS credentials: the access key ID, secret access key, region, etc. Note that if you run the configuration as root, only root can use those credentials, so the S3 copy would then have to run as root.
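
A sketch of that interactive flow (all values shown are placeholders):

sudo su - {user}
aws configure
# AWS Access Key ID [None]: XXXXXXXXXXXXXXX
# AWS Secret Access Key [None]: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# Default region name [None]: us-east-1
# Default output format [None]: json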