I’m creating some DynamoDB tables via CloudFormation, which will be accessed by Python lambdas. It looks like I have two choices in naming the tables: let CF do it, and therefore get a whole bunch of random characters in the name, or specify the name explicitly, which gives me an easy-to-access name but means I can’t make changes via CF without replacing the table.

I think the downside of not having a simple name is that I either need a config file that is different for every account I run the CF in (because they will have different random names), or I have to write code to discover the name on each invocation of the lambda.

So what do y’all do? Explicit names and then deal with replacement for changes, or let CF do it and deal with DB name discovery in the code? Or something else I’m not thinking of?

jedberg

1 Answer


CloudFormation lets you create templates for your stack configurations, so I would suggest treating them as such: you probably don't want hardcoded values in your templates.

There are a few ways to make names dynamic. One is to not specify a name at all, and AWS will generate a unique one for you. Another is to use the !Sub or !ImportValue/!Ref intrinsic functions to build up dynamic values, e.g. TableName: !Sub "${AWS::StackName}-my-unique-content", which will always be unique per stack but will also carry some descriptive information about the contents inside.

If you have your Python lambdas inside the same stack, pass in the table name as an environment variable (imo the easiest way):

PythonFunction:
  Type: "AWS::Serverless::Function"
  Properties:
    Environment:
      Variables:
        TABLE_NAME: !Ref DynamoTableResource
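On the Python side, the function then just reads that variable, so the code never needs to know the generated name. A minimal sketch, assuming the TABLE_NAME key from the template above (the boto3 import is deferred into the handler so the module loads without it outside the Lambda runtime):

```python
import os

def table_name() -> str:
    """Resolve the table name that CloudFormation injected via Environment."""
    name = os.environ.get("TABLE_NAME")
    if not name:
        raise RuntimeError("TABLE_NAME is not set; check the stack's Environment block")
    return name

def handler(event, context):
    # boto3 ships with the Lambda runtime; imported here so the module
    # can be imported and unit-tested without it installed locally.
    import boto3
    table = boto3.resource("dynamodb").Table(table_name())
    table.put_item(Item={"id": event["id"]})
    return {"statusCode": 200}
```

This way the same code runs unchanged in every account, since each stack injects its own generated name.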

If they are created in another stack, but in the same account, your best bet is to export the table name as an Output, and then reference it by calling the !ImportValue function. For example:

dynamostack.yaml
Outputs: 
  DynamoDBResource:
    Description: "DynamoDB table"
    Value: !Ref DynamoTableResource
    Export:
      Name: !Sub "${AWS::StackName}-exported-dynamo-table-name"

And in another stack:

functionstack.yaml
PythonFunction:
  Type: "AWS::Serverless::Function"
  Properties:
    Environment:
      Variables:
        TABLE_NAME: !ImportValue "dynamostack-exported-dynamo-table-name"

Be careful with inter-stack references though, you might end up with circular dependencies all over the place.

Another option is to utilise Parameters in the CloudFormation template: pass the DynamoDB table names in as parameters and reference them (using !Ref) in your lambda functions.
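A sketch of the Parameters approach (the parameter name DynamoTableName here is illustrative, not from your template):

```yaml
Parameters:
  DynamoTableName:
    Type: String
    Description: "Name of an existing DynamoDB table"

Resources:
  PythonFunction:
    Type: "AWS::Serverless::Function"
    Properties:
      Environment:
        Variables:
          TABLE_NAME: !Ref DynamoTableName
```

You would then supply the value at deploy time, e.g. with aws cloudformation deploy --parameter-overrides DynamoTableName=my-table, which keeps the template itself free of hardcoded names.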

donis