5

How do I write automation scripts in PowerShell (and possibly Python) that access APIs while keeping the credential information secure, either inside the script or outside it (if stored and retrieved externally)?

Initially, I had thought to use LastPass, like I do for web browsers, and somehow retrieve the credentials in a secure fashion, but it has no support for anything like that.

I then studied the Windows Data Protection API (DPAPI), but the key used to encrypt the password is specific to both the user and the machine the code runs under. Since I deploy and set up VMs and containers a lot, and need to access SOME of the APIs as part of the setup process, I cannot use that method.
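To illustrate the limitation, here is a minimal sketch of the DPAPI behaviour described above, assuming the pywin32 package (`win32crypt`); the ciphertext it produces can only be decrypted by the same user on the same machine, which is exactly why it breaks down for freshly provisioned VMs and containers:

```python
# Minimal DPAPI sketch, assuming pywin32 is installed (pip install pywin32).
import win32crypt

secret = b"my-api-password"

# Encrypt under the current user/machine profile.
blob = win32crypt.CryptProtectData(secret, "api credentials", None, None, None, 0)

# Decryption only succeeds for the same user on the same machine;
# on a newly deployed VM or container this call would fail.
description, recovered = win32crypt.CryptUnprotectData(blob, None, None, None, 0)
assert recovered == secret
```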

So how should I go about doing this?

I heard we can use KeePass or tools like it, but how would you use them to keep your credentials safe for your code while still retrieving them for your API calls?

What should I do?

Anders
AdilZ
  • I know that question asks about an encryption key, and this asks about credentials for an API, but the answer is the same. – Anders Dec 01 '16 at 18:20
  • Only thing I'd add to that answer is "HSM-like" software like KeyWhiz or Hashicorp's Vault - they aren't HSMs, but they may offer a flexible way to store and manage keys that can be accessed through rule based auth – crovers Dec 01 '16 at 18:29
  • I actually don't think this is a duplicate, but more of a continuation of that question in the form of "What is the best way to access stored credentials for service use?" instead of where is best to store it – Robert Mennell Dec 04 '16 at 18:52

2 Answers

2

Although Anders pointed out a duplicate, it doesn't cover your case well: the difference is that you need a password for an API, not a key for encryption.

You can use another web service that serves one-time passwords. It could work this way:

  1. Your Python script on server A connects to server B, the "password server"
  2. You send credentials to the password server to authorize yourself
  3. You retrieve a one-time password (or 1-day password) for the API
  4. You connect from server A to server C, where the API is, and send your one-time or 1-day password
  5. The API server C checks with server B whether this password can be used
  6. If the key matches, you get a positive response from server C (the API) to server A (the Python client).
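A minimal Python sketch of steps 1–4, assuming a hypothetical password server exposing an `/otp` endpoint and an API that accepts the one-time password in a header (all URLs, endpoints, and parameter names are illustrative):

```python
import requests

# Hypothetical endpoints -- replace with your own password server (B) and API (C).
PASSWORD_SERVER = "https://passwords.internal.example.com"
API_SERVER = "https://api.internal.example.com"

def get_one_time_password(client_id, client_secret):
    """Steps 1-3: authorize to server B and retrieve a one-time password."""
    resp = requests.post(
        f"{PASSWORD_SERVER}/otp",
        json={"client_id": client_id, "client_secret": client_secret},
        timeout=10,
        verify=True,  # always verify the TLS certificate (see below)
    )
    resp.raise_for_status()
    return resp.json()["otp"]

def call_api(otp):
    """Step 4: call the API on server C, presenting the one-time password."""
    resp = requests.get(
        f"{API_SERVER}/v1/resource",
        headers={"Authorization": f"OTP {otp}"},
        timeout=10,
        verify=True,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    otp = get_one_time_password("my-script", "bootstrap-secret")
    print(call_api(otp))
```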

You should be using SSL (TLS) for all connections and verify that the certificates are valid to prevent MITM attacks. You could also use client certificates to authenticate yourself along with the password.
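For reference, the `requests` library supports both a client certificate and a pinned CA bundle out of the box; the certificate paths below are illustrative:

```python
import requests

resp = requests.get(
    "https://api.internal.example.com/v1/resource",
    cert=("/etc/ssl/client.crt", "/etc/ssl/client.key"),  # client (user) certificate
    verify="/etc/ssl/internal-ca.pem",  # verify against your internal CA only
    timeout=10,
)
resp.raise_for_status()
```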

You can extend the security of such a system in many ways, for example by limiting the one-time password to your Python script's IP address.

Also, your API server C can cache the passwords (ideally in RAM) so it won't retrieve them from server B on every API call.
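For instance, a toy in-RAM cache on server C might look like this; the `check_with_server_b` callback stands in for the real call to the password server and is purely hypothetical:

```python
import time

_otp_cache = {}  # one-time password -> expiry timestamp, held in RAM only
OTP_TTL_SECONDS = 24 * 60 * 60  # matches a 1-day password

def is_valid_otp(otp, check_with_server_b):
    """Return True if the OTP is cached and fresh, or server B vouches for it."""
    expiry = _otp_cache.get(otp)
    if expiry is not None and expiry > time.time():
        return True  # cache hit: no round trip to server B
    if check_with_server_b(otp):
        _otp_cache[otp] = time.time() + OTP_TTL_SECONDS
        return True
    return False
```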

The security benefit is that you get full auditing on server C and there are no permanent passwords, so you never need to change them.

Aria
1

There are a LOT of ways to go about this. A short list includes:

  • Web Service(s)
  • Encrypted local storage
  • Encrypted accessible storage shared between VMs / containers

While some of them have a lot of advanced features (auditing, full control, much better security), they will often ALL require some sort of setup. Since you mentioned being able to deploy containers, let's go with a middle route: a frozen, hard-to-access method.


The Best Image(s)

Since you can deploy containers easily, this option is to have a container whose SOLE purpose is to serve the encrypted auth material to the scripts and return before dying off. It is reachable only from the LAN of the containers in your VPL (Virtual Private LAN), so it NEVER communicates with the outside world. At that point, all that lives on the script container is a decryption key. With only your internal containers and Docker machines on that network, no one can listen in: your script holds the decryption key, and the encrypted secrets are only accessible locally. If either piece ever gets compromised, upload 2 new container images and you're done.
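As a rough illustration of the two-container split, here is a minimal sketch assuming Python's standard `http.server` for the secret container and the Fernet scheme from the `cryptography` package; the addresses, port, and file paths are all illustrative:

```python
# secret_container.py -- serves ONLY the encrypted credential blob,
# bound to an internal-only address so it never talks to the outside world.
from http.server import BaseHTTPRequestHandler, HTTPServer

with open("/secrets/api-credentials.enc", "rb") as f:
    ENCRYPTED_BLOB = f.read()

class SecretHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(ENCRYPTED_BLOB)

if __name__ == "__main__":
    # Internal LAN address on an obscure port, per the pros list below.
    HTTPServer(("172.18.0.5", 49152), SecretHandler).serve_forever()
```

```python
# script_container.py -- fetches the blob over the internal LAN and
# decrypts it with the key that exists ONLY on this container.
import requests
from cryptography.fernet import Fernet

blob = requests.get("http://172.18.0.5:49152", timeout=5).content
with open("/keys/decrypt.key", "rb") as f:
    key = f.read()
api_password = Fernet(key).decrypt(blob).decode()
```

If either image is compromised, you regenerate the Fernet key, re-encrypt the credentials, and push two fresh images, exactly as described above.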

Pros:

  • Already uses a model you're deploying
  • LAN only. Use a reserved internal IP (in the 172.16.0.0/12 private range) on a controller to communicate with this container
  • Multiple points of security mean that if compromised, new credentials can be uploaded in freshly made, clean Docker images
  • EXTREMELY hard to compromise (not the same machine, not the same code; an attacker would have to compromise BOTH ends to make it worthwhile)
  • Easily protected (who needs SSH on a container, right?)
  • EXACT port control (don't use common ones; use a really obscure one for your purposes only)
  • Many more

Cons:

  • Two containers to co-maintain: one with the encrypted piece, one with the decryption piece
  • Oh no, two uploads

Since this never goes out over the net, no one outside the network can grab the packets and attempt password-guessing passes to discover it. The list of pros definitely outweighs the small amount of effort it takes to set this up, since it's just a container image. Upload the new code, done.

Now that you never store the plaintext password on the container with the script or the container with the decryption key, you can be assured that at rest it is next to impossible to crack, especially if you use TLS with cert pinning. Now you have the best of EVERY world, even if the containers somehow end up on the same host.

The only people who could crack this at that point would be you and anyone else privy to how exactly it was set up. However, that's because the most dangerous link in the cyber-security chain is the human(s).

Robert Mennell
  • thanks... this looks to be an amazing idea... the only problem is, even container deployment takes time... and the parts of the script requiring the same authentication for the same set of APIs are more on the post-deployment, regular-use end, where the authentication call will be made frequently and sporadically based on user demand for the script's output... I will be keeping this in mind for future deployment automation schemas and definitely using it where I can... This is AWESOME !!! – AdilZ Dec 01 '16 at 19:20
  • regardless... for now... I think something like encrypted local storage sounds the most intriguing... If I am assuming correctly, we can encrypt the folder, copy it, and also have a separate decryption key... therefore I can add that to the deployment specs, and once it's copied, I can use the decryption key to access it from my script... now if my assumption is true and this concept possible, how would you recommend I go about it? – AdilZ Dec 01 '16 at 19:32
  • That's the tricky part. Since it will take time to develop any option, which one is better? Any will require overhauling the script to some extent... Unless you can somehow shove a TKIM up onto the server that the images can access, the best you can do is another Docker image, or a server to decrypt the stuff on the private LAN. At that point, what is the advantage of the server over the image? – Robert Mennell Dec 01 '16 at 19:46