
These are two questions so closely related that I am asking them together - hope that's OK.

When I

  1. create new SSH keys (e.g. with `ssh-keygen -t rsa -b 4096` or `... 8192`); and when I
  2. create new prime numbers for /etc/ssh/moduli (e.g. with `ssh-keygen -G /tmp/moduli-2048.candidates -b 2048; ssh-keygen -a 800 -T /etc/ssh/moduli-2048 -f /tmp/moduli-2048.candidates`),

I wonder what the safest but still practical options are for the key size (`-b xxxx`) and the number of rounds (`-a yyyy`) on a standard PC, given that more bits usually mean better security but make it much harder to test for security-relevant properties like primality?

Of course, in case 1 you may not need safe primes, just "real" primes - but the question remains how to obtain optimal keys, preferably within a few hours of PC CPU time (a few days for the moduli is OK, I guess).

(I wondered whether a question like this could be asked here, but concluded that it probably can, since what I need is not a how-to but advice from specialists on technologies too complex for me to comprehend.)

Ned64
  • See also [Secure Secure Shell](https://stribika.github.io/2015/01/04/secure-secure-shell.html). – Sjoerd Jan 22 '17 at 15:59
  • See also [Are there any security benefits to deploying custom SSH DH groups to client-only systems?](https://serverfault.com/q/693988/58408) on Server Fault. – user Jan 22 '17 at 17:19
  • Thanks a lot for the reading list, I will look at it. However, I do not really think the Server Fault article really addresses the problem of weak primes, does it? The question about server vs. client is also not answered. Furthermore, my question does apply to servers, as well. – Ned64 Jan 22 '17 at 20:08
  • Is there no-one who knows about the most secure bit sizes today? – Ned64 Feb 27 '17 at 09:44
  • Old bump, but as of 2018 standards, 2048-bit DH and RSA seems to be the choice of most security experts. I myself have opted for 4096 bits or higher for my own purposes and paranoia. The true answer can only be a recommendation, and what _not_ to use. – Dooley_labs Feb 14 '19 at 20:14
  • @Dooley_labs Well, those recommendations usually do not consider the way these keys are generated (hard- and software) but assume "perfect key generation". My question covers these much more practical issues. – Ned64 Oct 06 '19 at 12:31
  • See also [What are ssh-keygen best practices?](https://security.stackexchange.com/q/143442/42391), whose [accepted answer](https://security.stackexchange.com/a/144044/42391) (by me) is still valid today. – Adam Katz Mar 21 '22 at 18:21
  • @AdamKatz Thanks for the interesting link for the first part of my question. Secure Secure Shell is a great site, but it is 7 years old now, which is a long time in IT sec. Do you have a reference for why it is still valid? Also, what about part 2, please: Can a PC generate a prime number of 4k bits with reasonable safety (that a number _is_ a prime number)? What is a good `-a` parameter to filter candidates? Is 2k safer because the primality is easier to test? – Ned64 Mar 21 '22 at 22:44
  • If you are communicating with (other) OpenSSH implementations not older than 2012 (or 2014 if RedHat-family), or other implementations that are not ancient or awful, classic aka modp DH is never used because XDH (aka curve25519) or ECDH is preferred, so any generation of moduli is a useless waste. Similarly using ed25519 or any permitted ecdsa for authentication is much more efficient and at least as secure as RSA-3k; ecdsa -b 384 is more secure (equal to maybe RSA-8k or so). – dave_thompson_085 Mar 22 '22 at 01:10
  • @Ned64 – Many of the Secure Secure Shell suggestions (like preferring ED25519) are now the OpenSSH defaults. I am under the impression that `-a` has diminishing returns at some level and that `-a 100` exceeds it, but I don't know how robust that assumption is. – Adam Katz Mar 22 '22 at 01:20

1 Answer


Generating an RSA key only requires finding two random primes of the appropriate size, so it can be done fairly quickly. The DH moduli file you generate will contain many primes, and these are safe primes, which take longer to find (a prime p is a safe prime if (p − 1)/2 is also prime). Using safe primes for Diffie–Hellman moduli protects the key exchange from various security issues, such as small-subgroup attacks. In addition, generating a single 4096-bit DH modulus requires finding one 4096-bit safe prime, whereas generating a single 4096-bit RSA modulus requires finding two regular 2048-bit primes, which is much faster. This, combined with the fact that the moduli file contains many moduli, is why generating DH moduli takes so much longer. For both the RSA key and the DH moduli, you want to select 4096 bits.
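To get a feel for why safe primes are so much more expensive, here is a minimal Python sketch (not OpenSSH's actual code, which sieves candidates first) that searches for a regular prime and then a safe prime of the same size, assuming the `sympy` library is available; the bit size is deliberately tiny so the demo finishes in seconds:

```python
import time
from sympy import isprime, randprime  # assumption: sympy is installed

def random_prime(bits):
    """Pick a random prime of the given bit length."""
    return randprime(2**(bits - 1), 2**bits)

def random_safe_prime(bits):
    """Search for a safe prime p, i.e. one where (p - 1) / 2 is also prime."""
    while True:
        p = randprime(2**(bits - 1), 2**bits)
        if isprime((p - 1) // 2):
            return p

bits = 256  # tiny compared to 4096, so this runs quickly
start = time.perf_counter()
random_prime(bits)
print(f"regular {bits}-bit prime: {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
random_safe_prime(bits)
print(f"safe {bits}-bit prime:    {time.perf_counter() - start:.2f}s")
```

The gap widens rapidly with size: heuristically, the density of safe primes falls off roughly with the square of the bit length, which is why screening a full set of 4096-bit moduli can take days while a 4096-bit RSA key takes seconds.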

The -a option does double duty. When generating a key, it sets the number of KDF rounds used to derive the encryption key that protects your private key with a passphrase; assuming you wish to password-protect your private key, you want this as large as possible without the delay becoming annoying, since it slows down a potential brute-force attack against the passphrase. When generating DH moduli, the same flag instead sets the number of primality tests each candidate must pass. For that purpose, anything above around 64 is plenty; letting it use the default (100) is just fine, and there is no real reason to increase it. The second half of my answer explains why in further detail.
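OpenSSH's current private key format derives that encryption key with bcrypt_pbkdf, but the principle is easy to demonstrate with PBKDF2 from Python's standard library: each password guess costs the attacker time proportional to the round count. A rough sketch (the parameters are illustrative only):

```python
import hashlib
import time

def kdf_time(rounds):
    """Time a single key derivation at the given work factor."""
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"correct horse", b"fixed demo salt!", rounds)
    return time.perf_counter() - start

# Each tenfold increase in rounds multiplies the per-guess cost by roughly 10:
for rounds in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{rounds:>9} rounds: {kdf_time(rounds):.3f}s per guess")
```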

Remember, a large RSA key makes it more difficult for an attacker to forge authentication, and a large DH modulus makes it difficult to recover your session key and read (or modify) encrypted traffic. Using a custom modulus instead of an existing, common one protects against Logjam-style precomputation attacks, although not as much as simply choosing a larger modulus (e.g. a custom 1024-bit modulus does not provide as much security against precomputation attacks as a public 2048-bit modulus).


You also asked in your bounty:

> I would like to see algorithmic issues e.g. in the prime tests running on a standard PC considered.

Most prime generation uses the Miller–Rabin primality test, which I explain in more detail in another answer. This is a probabilistic test which can determine that a number is composite with 100% certainty, but can only determine that a number is prime with, at worst, 75% certainty.* That is, each round of the test returns either "certainly composite" or "probably prime". Prime numbers will always return "probably prime", no matter how many times the test is run. Composite numbers will return "probably prime" at most 1/4 of the time per round. If the test is run multiple times in a row, the chance that a composite number consistently fools the test into thinking it's prime becomes astronomically low. OpenSSH uses 100 rounds by default, so the chance that a generated prime is not actually prime is, at worst, 4^(-100).
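For reference, here is a minimal Python sketch of the Miller–Rabin test; OpenSSH's real implementation is in C and is more careful about randomness and edge cases:

```python
import random

def miller_rabin(n, rounds=100):
    """Return False if n is certainly composite, True if n is probably prime."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 as 2^s * d with d odd.
    s, d = 0, n - 1
    while d % 2 == 0:
        s += 1
        d //= 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)  # random base for this round
        x = pow(a, d, n)                # a^d mod n
        if x in (1, n - 1):
            continue  # this base reveals nothing; try another
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # witness found: n is certainly composite
    return True  # survived every round: n is probably prime
```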

There are deterministic algorithms such as the AKS primality test that can determine with 100% certainty whether or not a number is prime, but they are significantly slower and do not provide a meaningful increase in security, so probabilistic tests are used instead. The chance of accidentally generating a composite instead of a prime after 100 rounds is, at worst, the chance of guessing a 200-bit key on your very first attempt (because 4^(-100) = 2^(-200))! Knowing that, you can see why a PC can generate prime numbers with more than reasonable certainty that they are indeed prime.
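Spelling that equivalence out:

```latex
% Worst-case error after 100 independent Miller-Rabin rounds:
\left(\tfrac{1}{4}\right)^{100} = 4^{-100} = \left(2^{2}\right)^{-100}
                                = 2^{-200} \approx 6.2 \times 10^{-61}
```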

Using -a to adjust the number of primality tests has much less of an impact on the time it takes to generate primes than using -b to adjust the size of the primes. The reason is that -a specifies the maximum number of primality tests to perform before concluding that a number is indeed prime. The vast majority of numbers are composite, and the first time Miller–Rabin returns "certainly composite" for a candidate, there is no reason to keep testing it, so OpenSSH discards it after only a few rounds and moves on to the next candidate. It doesn't matter all that much whether you specify 100 or 1000, because only the numbers that actually are prime go through (and pass) every test.
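Continuing the Miller–Rabin sketch above, you can observe this directly: nearly every random odd candidate is rejected in its very first round, so only the rare actual primes ever consume all 100 (or 1000) rounds:

```python
import random
from collections import Counter

def rounds_survived(n, cap=100):
    """Count how many Miller-Rabin rounds odd n > 3 survives (cap if it never fails)."""
    s, d = 0, n - 1
    while d % 2 == 0:
        s += 1
        d //= 2
    for i in range(cap):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return i  # failed in round i + 1, i.e. survived only i rounds
    return cap

# 1000 random odd 256-bit candidates: composites almost always fail immediately,
# so raising the round cap barely changes the total work.
candidates = [random.getrandbits(256) | (1 << 255) | 1 for _ in range(1000)]
print(Counter(rounds_survived(n) for n in candidates))
```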

* The accuracy of this probabilistic test is actually far higher if you're testing random odd numbers chosen from a uniform distribution, which is the case if you are generating primes. It is proven that a k-bit composite will survive a single round of testing with a chance less than k^2 · 4^(2-√k) for k ≥ 2. Stronger bounds exist for larger numbers, with 2^(-75) for k = 600!


Another thing to be aware of, which dave_thompson_085 pointed out in a comment, is that modern OpenSSH implementations support better algorithms like Ed25519 for authentication and Curve25519 for key exchange. Curve25519 does not require generating prime moduli to work and Ed25519 keys can be generated far more quickly than RSA keys because it is unnecessary to generate random primes. Both these algorithms are extremely secure and should be preferred when possible.
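To illustrate the speed difference, here is a quick sketch using the third-party pyca/cryptography package (an assumption; `ssh-keygen -t ed25519` shows the same effect from the command line):

```python
import time
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

start = time.perf_counter()
Ed25519PrivateKey.generate()  # no prime search: 32 random bytes plus curve arithmetic
print(f"Ed25519:  {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
rsa.generate_private_key(public_exponent=65537, key_size=4096)  # two 2048-bit primes
print(f"RSA-4096: {time.perf_counter() - start:.3f}s")
```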

forest
  • Custom modulus is neither necessary nor sufficient against logjam. What is needed is >>1024 and correctly generated; doing your own generation for OpenSSH is only needed if you think the OpenSSH devs are incompetent or malicious, in which case you shouldn't use OpenSSH at all for anything. Custom only provides limited mitigation if you use <=1024, by making you a less rewarding target. – dave_thompson_085 Mar 22 '22 at 01:12
  • @dave_thompson_085 A custom modulus makes it more difficult for the attack to scale because the expensive precomputation sieving can't be done once and applied to many key exchanges, but must be targeted at you. Using a larger modulus is a better mitigation of course, and a modulus above about 2900 bits makes even the reduction stage nearly impossible: https://crypto.stackexchange.com/q/76434/54184 – forest Mar 22 '22 at 01:13
  • Put another way, if a 2048-bit modulus becomes breakable in the future (e.g. due to improvements in algorithms to calculate discrete logarithms) for $10 billion, generating your own 2048-bit modulus now will protect you, unless you personally are a target worth an extra $10 billion. So generating your own keys does help vs using the standard Oakley groups unless you assume no cryptographic improvements that would put 2048-bit DH groups in danger of precomputation. – forest Mar 22 '22 at 01:19