
I am trying to move all my configuration files to S3 storage and access them directly from Terraform during provisioning. However, I was not able to point to an S3 object directly the way I can with a local file path.

So instead of;

content = templatefile("/home/user/config/inventory.tftpl", { ... })

I like to be able to use;

content = templatefile("s3://remote-state-storage-548/trio-eks-cluster/templates/inventory.tftpl", { ... })

Of course this doesn't work, so I had to use the aws_s3_object data source instead. I then tried passing the "body" attribute of that data source directly to the templatefile function, but it failed even after changing the object's content type to text, because templatefile expects a "path", not "content".
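
For reference, this was roughly the failing attempt (the variable map is elided here; the full map is in the snippet further below):

# Failing attempt: templatefile() treats its first argument as a filesystem
# path, so passing the object's body (the template content itself) fails
content = templatefile(data.aws_s3_object.inventory.body, { ... })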

So as a last resort, I copied the body of this data source into a local file and then used that file (see the workaround below).

But this defeats the whole purpose of getting rid of local configuration files, certificates, etc. So, what do you guys think? How can I accomplish this?

###### S3 JINJA2 INVENTORY TEMPLATE ######

data "aws_s3_object" "inventory" {
  bucket = "remote-state-storage-548"
  key    = "trio-eks-cluster/templates/inventory.tftpl"
}

# Workaround: persist the S3 object body to disk so templatefile() can read it
resource "local_file" "foo" {
  content  = data.aws_s3_object.inventory.body
  filename = "inventory.tftpl"
}


###### INVENTORY CREATION FROM TEMPLATE FILE ######
resource "local_file" "inventory" {
  filename = "./ansible/inventory"
  # Render from the copy written by the workaround above, not the old local path
  content = templatefile(local_file.foo.filename,
    {
      do_droplets = module.digital_ocean.droplets[*]
      aws_bastion = module.aws.bastion[*]
      workers     = module.aws.workers[*]
  })

  provisioner "local-exec" {
    working_dir = "/home/user/ansible/trio/"
    command     = "ansible-playbook -i inventory trio-final.yml"
  }

  depends_on = [
    module.aws
  ]
}
