The answer to this drove me nuts for a while, but I finally have one. It caches lookups with Django's cache framework (Memcached in my case), so it's fast in production.
- In GCP, go to APIs & Services and enable the Google Cloud Datastore API.
- Set up a service account and authenticate by following this tutorial: https://cloud.google.com/docs/authentication/getting-started
- In your local environment, `pip install google-cloud-datastore`.
- Create a module called something like `datastore.py` in the same folder as `settings.py`.

Here is my `datastore.py`:
```python
from google.cloud import datastore
from django.core.cache import cache


class DataStoreClient:
    def __init__(self):
        self.client = datastore.Client()

    def get(self, property):
        cache_key = 'env-' + property
        key = self.client.key('environment_variables', property)
        try:
            result = cache.get(cache_key)
            if not result:
                result = self.client.get(key)
                cache.set(cache_key, result, 86400)  # cache for 24 hours
            return result['Value']
        except TypeError:
            # client.get() returned None: the entity doesn't exist yet.
            print(
                "{} is not a property in Cloud Datastore. ".format(property) +
                "We are creating one for you. Go to Cloud Datastore to set a value."
            )
            entity = datastore.Entity(key=key)
            entity.update({'Value': 'NOT_SET'})
            self.client.put(entity)
            return 'NOT_SET'
```
Then in `settings.py`, you fetch keys like this:

```python
from your_app.datastore import DataStoreClient

datastore_client = DataStoreClient()
SECRET_KEY = datastore_client.get('SECRET_KEY')
```
When you run `python manage.py runserver` for the first time, the system will populate `NOT_SET` values in Cloud Datastore. From there, you can go to the GCP console and set real values for those keys.
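The lookup logic boils down to a three-step flow: check the cache, fall back to Datastore, and create a placeholder entity if nothing exists. Here is a minimal, credential-free sketch of that same flow, with plain dicts standing in for Django's cache and the Datastore client (the `FakeDataStoreClient` name is made up for illustration):

```python
# Plain dicts stand in for django.core.cache and the Datastore client,
# so this runs anywhere and shows the cache -> datastore -> create flow.
class FakeDataStoreClient:
    def __init__(self, cache, store):
        self.cache = cache    # stands in for Django's cache
        self.store = store    # stands in for Cloud Datastore

    def get(self, property):
        cache_key = 'env-' + property
        result = self.cache.get(cache_key)
        if result is None:
            result = self.store.get(property)
        if result is None:
            # Entity missing: create a NOT_SET placeholder, as above.
            self.store[property] = {'Value': 'NOT_SET'}
            result = self.store[property]
        self.cache[cache_key] = result
        return result['Value']


cache, store = {}, {'SECRET_KEY': {'Value': 's3cret'}}
client = FakeDataStoreClient(cache, store)
print(client.get('SECRET_KEY'))   # existing value comes back: s3cret
print(client.get('DEBUG'))        # missing key is created as NOT_SET
```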
This solution is nice in that, once a value exists, it will retrieve it for you.
If you really want to get crazy (as I have), you could write a script that collects all your environment variables locally, encrypts them with KMS, and uploads them, then rewrite `DataStoreClient` to download and decrypt them.
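That encrypt-and-upload step could look roughly like the sketch below. The crypto is kept pluggable: `encrypt_fn`/`decrypt_fn` are placeholders where the KMS encrypt and decrypt calls would go (base64 stands in here purely so the example runs without GCP credentials), and the helper names are my own invention, not part of any library:

```python
import base64


def encrypt_env(env, encrypt_fn):
    """Encrypt every value; returns {name: ciphertext} ready to upload.

    encrypt_fn is a placeholder for a KMS encrypt call; we pass base64
    below just so the sketch runs locally.
    """
    return {name: encrypt_fn(value.encode()) for name, value in env.items()}


def decrypt_value(ciphertext, decrypt_fn):
    """Mirror of what a KMS-aware DataStoreClient.get would do."""
    return decrypt_fn(ciphertext).decode()


# base64 as a trivial, reversible stand-in for KMS encrypt/decrypt
env = {'SECRET_KEY': 's3cret', 'DB_PASSWORD': 'hunter2'}
blobs = encrypt_env(env, base64.b64encode)
assert blobs['SECRET_KEY'] != b's3cret'  # nothing stored in plaintext
assert decrypt_value(blobs['DB_PASSWORD'], base64.b64decode) == 'hunter2'
```

With real KMS you would swap `base64.b64encode`/`b64decode` for calls to the key service, then `put` each ciphertext into Datastore the same way the `NOT_SET` entities are written above.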