
I am doing a bunch of test/data scraping projects across a few different platforms that consume read-only, not especially important API keys. I can only foresee minimal damage if they were to get out.

With that said, I can't see any glaring reason Vault couldn't distribute them safely. Is there a risk I'm missing in using Vault past the demarc?

Samuel Philipp

2 Answers


Well, one consideration is a potential cash overflow attack. If an attacker has access to private API keys of yours that work like this:

  • Make a request to the API
  • API logs the request and counts it as 1 of your 200 included usages
  • Every usage past 200 usages costs $0.05

An attacker who hammers the API with your key can run up your bill and cause you real financial problems; a rough cost sketch is below. See: https://www.owasp.org/index.php/Cash_Overflow
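For a sense of scale, here is a minimal back-of-the-envelope sketch in Python using the hypothetical pricing from the list above; the request rate is an assumption, not a measured figure:

```python
# Hypothetical pricing from the list above: 200 included usages,
# then $0.05 per additional usage.
INCLUDED_USAGES = 200
COST_PER_EXTRA_USAGE = 0.05  # USD

def attack_cost(total_requests: int) -> float:
    """Bill run up by an attacker issuing `total_requests` API calls."""
    billable = max(0, total_requests - INCLUDED_USAGES)
    return billable * COST_PER_EXTRA_USAGE

# One machine looping at ~10 requests/second for a single day:
requests_per_day = 10 * 60 * 60 * 24              # 864,000 requests
print(f"${attack_cost(requests_per_day):,.2f}")   # prints $43,190.00
```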

Some other abuse cases for APIs may be found here: https://www.owasp.org/index.php/Category:API_Abuse


Recommendations? I highly recommend AWS Secrets Manager or Azure Confidential Computing. They both provide different levels of assurance for the security of your API keys, in different ways.

https://aws.amazon.com/secrets-manager/ - Works by providing an API in which you can store your other API keys securely, using FIPS-compliant encryption; then all you have to secure are your Secrets Manager credentials/API info.
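As a hedged sketch of how that looks in practice with boto3 (the secret name and region below are assumptions for illustration, and AWS credentials are assumed to come from the environment or an IAM role):

```python
import json
import boto3

# Hypothetical secret name; Secrets Manager stores the payload encrypted at rest.
client = boto3.client("secretsmanager", region_name="us-east-1")
response = client.get_secret_value(SecretId="scraper/readonly-api-key")

# Assumes the secret value was stored as a JSON string, e.g. {"api_key": "..."}
secret = json.loads(response["SecretString"])
api_key = secret["api_key"]
```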

https://azure.microsoft.com/en-us/solutions/confidential-compute/ - Works by performing sensitive operations in a TEE (Trusted Execution Environment), a cryptographically secured and logically separated processing unit.

leaustinwile
  • That's why I wanted a way to manage the secrets better than env vars and leaving them out in the ether, but I use a prepaid debit card for those anyway, JIC. – Kyle Sponable May 03 '19 at 20:31

Assuming that you plan on your Vault being accessible to the public Internet, and that you have enabled audit logging:

Consider a disk-space-based denial of service.

If you enable Vault's audit logging, Vault will log every request, including those that fail authentication. Vault will hash any secrets that may be present in those requests, but will leave any non-secrets (like the keys of an entry in Vault's key/value store) alone.

If you are using the file audit device, these logs will be written to a file. If an attacker tries to repeatedly write the secret {"(insert ten megabytes of noise here)":"1"} to Vault, you may run out of disk space.
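As a rough, hedged illustration of that pattern (the Vault address, mount path, and sizes below are assumptions, and it relies on the audit behaviour described above, where even rejected requests are logged with key names left unhashed):

```python
import requests

VAULT_ADDR = "https://vault.example.com:8200"   # hypothetical Internet-facing Vault
NOISE_KEY = "A" * 10 * 1024 * 1024              # ~10 MB of noise used as a key name

# Each request fails authentication (the token is bogus), but an audit entry
# is still written, and the oversized key name inflates the audit log.
for _ in range(1_000):
    requests.post(
        f"{VAULT_ADDR}/v1/secret/data/junk",    # KV v2 write path, assumed mount "secret/"
        headers={"X-Vault-Token": "not-a-real-token"},
        json={"data": {NOISE_KEY: "1"}},
        timeout=10,
    )
```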

Mitigation: configure your audit device(s) (and any helper infrastructure) so that at least one audit device won't block. For example, configure logrotate with a maximum file size.

PlasmaSauna