In a very interesting presentation at Puppet Camp London, Tomas Doran suggested a fairly radical approach to keeping everything automated: managing tons of Docker containers with Puppet.
As a security-conscious person, I like the idea of Docker: everything runs in its own LXC container, and I can configure all services to run as non-root users.
As a systems-administration-conscious person, I like the idea of Puppet management: I can keep all configuration in a Git repository and even maintain different environments, all under version control. Another advantage is that I get tear-down-able (is that a word?) environments that I could theoretically rebuild from scratch without much manual intervention.
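As a sketch of what I mean by a rebuildable environment: a manifest like the following (the class and package names here are just illustrative) can live in a version-controlled Puppet environment and be applied to a fresh machine:

```puppet
# Illustrative example only: a self-contained profile that could live in
# a Git-managed Puppet environment and rebuild a web server from scratch.
class profile::webserver {
  package { 'nginx':
    ensure => installed,
  }

  service { 'nginx':
    ensure  => running,
    enable  => true,
    require => Package['nginx'],
  }
}
```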
However, there are things I'd rather not keep in a Git repository, such as SSL private keys and database passwords.
How do organizations that manage massive numbers of machines (like CERN) use provisioning services like Puppet and Chef while still maintaining security? Some things seem easy, like enforcing permissions on files, but others, like installing SSL keys or SSH host keys, seem to require manual intervention.
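To make the distinction concrete, here is a minimal sketch (the path is illustrative) of the easy part: Puppet can enforce ownership and permissions on a key file, but the sensitive content itself still has to come from somewhere outside the manifest:

```puppet
# Minimal sketch: enforcing permissions on an already-present key file.
# Note there is no 'content' or 'source' attribute -- the key material
# itself has to come from somewhere outside version control, which is
# exactly the part I'm asking about.
file { '/etc/ssl/private/example.key':
  ensure => file,
  owner  => 'root',
  group  => 'root',
  mode   => '0600',
}
```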