I am working with a distributed software solution where machines may communicate over the internet. Because sensitive data is involved, the traffic has to be encrypted. Using a public tool such as GnuTLS or OpenSSL, you can generate a CA and per-machine certificates to encrypt the traffic. I believe (I could be wrong) that if you create the CA and certificates with suitably secure parameters, this is technically sufficient to secure all of the communication traffic.
I know that it is common practice, and important, to have certificates signed by a trusted authority (companies like VeriSign or DigiCert). For a web server or a stand-alone application this is usually enough. My question is: in a distributed environment with n machines (n could be in the hundreds), is it acceptable to generate a CA for those n machines, have that CA signed by a trustworthy authority, and then programmatically generate TLS certificates for each of the n machines, signed by the generated CA?
I understand that CA trust has a sort of trickle-down effect, and I think this methodology is sound, but I am interested in feedback.
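For concreteness, here is roughly the kind of workflow I have in mind, sketched with openssl (the subject names, hostnames, and validity periods are just placeholders):

    # Generate the CA key and a self-signed CA certificate
    openssl genrsa -out ca.key 4096
    openssl req -x509 -new -key ca.key -sha256 -days 3650 \
        -subj "/CN=Example Internal CA" -out ca.crt

    # For each of the n machines: generate a key and CSR, then sign with the CA
    openssl genrsa -out machine1.key 2048
    openssl req -new -key machine1.key \
        -subj "/CN=machine1.example.internal" -out machine1.csr
    openssl x509 -req -in machine1.csr -CA ca.crt -CAkey ca.key \
        -CAcreateserial -sha256 -days 825 -out machine1.crt

The per-machine steps would be the part that gets scripted and run for each of the n machines.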
You're going to have a hard time getting a trusted CA to issue you a CA cert. That would give you power to issue pretty much any cert you wanted. You could create your own private CA and just configure all of your machines to trust your CA. This is what most enterprises do to handle this problem. – heavyd – 2016-01-07T23:20:53.890
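As a rough sketch, "configure all of your machines to trust your CA" might look like the following, assuming ca.crt is the private CA certificate generated above (paths and commands differ by distribution):

    # Debian/Ubuntu: install the private CA into the system trust store
    sudo cp ca.crt /usr/local/share/ca-certificates/example-internal-ca.crt
    sudo update-ca-certificates

    # RHEL/CentOS equivalent
    sudo cp ca.crt /etc/pki/ca-trust/source/anchors/example-internal-ca.crt
    sudo update-ca-trust extract

Applications that use their own trust store (e.g. a Java keystore) would need the CA added there separately.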
If you create the certificate yourself, then it doesn't matter who signs it. You don't need a (default) trusted certificate; any certificate can be trusted. – Ramhound – 2016-01-07T23:27:30.320
See also: http://serverfault.com/q/274852/2904 Another option would be to just automate the usage of Let's Encrypt to get a trusted cert from them that auto-renews every few months. – heavyd – 2016-01-07T23:36:40.550
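A hedged sketch of what automating Let's Encrypt per machine could look like with certbot, assuming each machine has a public DNS name and is reachable on port 80 (the domain is a placeholder):

    # Obtain a certificate for this machine via the standalone HTTP challenge
    sudo certbot certonly --standalone -d machine1.example.com

    # Renewal is typically run on a schedule, e.g. from cron or a systemd timer
    sudo certbot renew --quiet

This only works for machines with publicly resolvable names; purely internal hosts would need the DNS challenge or the private-CA approach instead.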