5

We have our web servers running in AWS EC2. We have about 30 API keys/passwords/etc. (sensitive data) which are set in the environment for our app to use. Whenever we deploy, we start a new server instance, pull down our repo, and build our application. I need a way to get the keys down to the server and into the environment for the application. Since we deploy multiple times a week, this can't be a manual process. I also don't want to put the keys in our repository. Even if we encrypted a file with all our keys, we would still need a key to decrypt it on the other side. I have a solution in place, but whenever we need to add or change keys it's not very straightforward.

Can anyone think of a good way for me to get these keys into the environment? Thanks!

trikosuave
  • 51
  • 1

4 Answers

1

I know I'm late on this one, but this might benefit someone else facing the problem now.

You could do the following:

  • Put a master key in the database.
  • Check in the actual key in the code repo, but encrypted by the master key. This checked-in key could be in a password-protected file.

A few months ago, AWS came out with a new service called KMS (Key Management Service). It takes care of managing the actual key and its master key in a secure and compliant manner.

Fayez
  • 85
  • 1
  • 2
  • 8
0

If I understood everything correctly, I think you have two scenarios (or more!).

Scenario 1

You could create a script that keeps checking for new (encrypted) keys. Whenever it spots a new one (detected through hashing?), it would download it and set everything up for the build. If written in Python, it would be less than 30 lines, maybe even under 20.

Obs 1: The file containing the new key needs to be encrypted. You could use a symmetric cipher like AES, with the secret kept by the agent.

Obs 2: The best method for the check would be for the web server to generate the hash and output it when the agent requests it.

Obs 3: The agent should not decrypt the key directly to disk. It may give you a little extra trouble, but it is better if the agent decrypts the key only in memory and then makes the changes in the necessary files.
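A minimal sketch of that agent's core pieces, assuming the server publishes a SHA-256 digest of the encrypted key file and that keys are stored as KEY=VALUE lines (both assumptions; the answer only specifies AES and hashing in general):

```python
import hashlib

def digest(data: bytes) -> str:
    # Obs 2: the web server publishes this digest of the encrypted key
    # file; the agent compares it with the digest of what it last applied.
    return hashlib.sha256(data).hexdigest()

def needs_update(last_applied_digest: str, advertised_digest: str) -> bool:
    # A changed digest signals a new key file to fetch.
    return last_applied_digest != advertised_digest

def apply_keys(plaintext: bytes) -> dict:
    # Obs 3: the plaintext stays in memory only. A real agent would first
    # run an AES decryption step here (e.g. via the third-party
    # `cryptography` package) using the secret it holds, never writing
    # the decrypted keys to disk.
    return dict(line.split("=", 1)
                for line in plaintext.decode().splitlines() if line)
```

The surrounding loop would fetch the advertised digest over HTTPS, call `needs_update`, and only when it changes download the file, decrypt it in memory, and export the parsed pairs into the application's environment.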

Scenario 2

The agent checks the local disk instead of a web location, looking for the new key. Whenever one is available, this robot puts it in the right place and cleans the "checking" location, leaving it ready for the next key update.

You could upload the key via SCP/SFTP/another encrypted channel, and this uploading procedure could be automated as your needs require.

With this method nothing needs to be encrypted except the key upload itself.
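A sketch of that local watcher, assuming the drop location and the target path sit on the same filesystem (the path names are illustrative):

```python
import os
import shutil

def check_drop_location(drop_path: str, target_path: str) -> bool:
    """Move a freshly uploaded key file into place and clean the drop
    location for the next update. Returns True when a key was applied."""
    if not os.path.exists(drop_path):
        return False  # nothing uploaded yet; check again next cycle
    # shutil.move is a rename on the same filesystem, so the "checking"
    # location is emptied in the same step that installs the new key.
    shutil.move(drop_path, target_path)
    return True
```

Run it from cron or a small loop; the upload itself happens out of band over SCP/SFTP.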

DarkLighting
  • 1,523
  • 11
  • 16
0

You could keep the keys on a dedicated system and provide cryptography services over the network. The key to access those cryptography services could be shared with developers, since it could only be used from specific IPs. A famous Internet provider did this for SSL certificates (I forget its name).

Enos D'Andrea
  • 1,047
  • 5
  • 12
0

Facing the same issue, I ultimately decided to provide the crypto keys via an API when an application asks for them.

This has several advantages and drawbacks:

  • You rely on some kind of containment/limitation. In my case it was IP filtering, which was optimal for the architecture we were in.
  • There is the risk of IP spoofing, which needs to be weighed.
  • The API also allowed me to provide centralized configuration (sent together with the keys upon request).
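The request-handling side of such an API could look like the sketch below. The allowed IPs, key names, and config payload are invented for illustration; the original only says IP filtering was used as the containment layer:

```python
ALLOWED_IPS = {"10.0.1.5", "10.0.1.6"}  # hypothetical app-server addresses

def authorize(remote_ip: str) -> bool:
    # The containment layer: only known application servers may ask.
    # Remember the spoofing caveat -- pair this with network-level controls.
    return remote_ip in ALLOWED_IPS

def handle_key_request(remote_ip, keys):
    # Deny unknown callers; otherwise return the keys together with the
    # centralized configuration, sent in the same response.
    if not authorize(remote_ip):
        return None
    return {"keys": keys, "config": {"log_level": "info"}}
```

Wrapped in any HTTP framework, the deploy script on a new instance would hit this endpoint once at boot and export the returned keys into the environment.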
WoJ
  • 8,957
  • 2
  • 32
  • 51