There are indeed two properties to a password:
- Complexity (number of possible characters in each position)
- Length (number of positions)
The number of possible passwords is simple to calculate: complexity ^ length, where ^ is exponentiation. As you might know, increasing the exponent (the length) grows this number much faster than increasing the base (the complexity).
For example, a random 6-character password drawn from a-z, A-Z, and 0-9 has a complexity of 62 (26 + 26 + 10) and a length of 6, giving 62^6 ≈ 56 billion possible passwords. It is well known that 6 characters is very insecure for most purposes, even when randomly generated.
For randomness, more is better, up until about 128 bits of entropy. A little more than that helps buffer against cryptographic weakenings of algorithms, but really, you don't want to memorize 128 bits of entropy anyway. Let's say we want to go for 80 bits of entropy, which is a good compromise for almost anything. (Unless you use a password manager, which you should. In that case, just generate 128-bit random passwords and you're good.)
To convert "number of possible values" to "bits of entropy", we need to use this formula: log(n)/log(2), where n is the number of possible values. So if you have 26 possible values (1 random letter), that would be log(26)/log(2) ≈ 4.7 bits of entropy. That makes sense, because you need 5 bits to store a letter: the number 26 is 11010 in binary.
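The formula maps directly to code; a minimal sketch (log(n)/log(2) is just the base-2 logarithm):

```python
import math

def bits_of_entropy(n):
    """Convert a count of equally likely values to bits of entropy."""
    return math.log(n) / math.log(2)  # equivalent to math.log2(n)

print(round(bits_of_entropy(26), 1))  # ~4.7 bits for one random letter
```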
A password with a complexity of 62 needs 14 characters to reach our target of 80 bits: log(62^14)/log(2) ≈ 83.4 bits of entropy.
Example: c21FApmUsptwfd
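One way to generate such a password yourself is Python's secrets module, which uses a cryptographically secure random source (the alphabet and length here just mirror the example above):

```python
import math
import secrets
import string

alphabet = string.ascii_letters + string.digits  # complexity 62
length = 14
password = "".join(secrets.choice(alphabet) for _ in range(length))
entropy = length * math.log2(len(alphabet))
print(password, f"~{entropy:.1f} bits")
```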
If you add all printable ASCII symbols, you get a complexity of about 95. These passwords are annoying to type, even harder to remember than ones without symbols, and you still need 13 characters: log(95^13)/log(2) ≈ 85.4 bits.
Example: ~2YPCi.%$6u,.
If you use words, your password becomes much longer, but is slightly easier to remember. The calculation is the same: number of possible elements to the power of the number of elements. If you have a dictionary of 7000 words and pick 6 random words, you have 7000^6 possible combinations. That is log(7000^6)/log(2) ≈ 76.6 bits of entropy.
Example: cardigans Malthusian's acorns glows unconfirmed uncluttered
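A passphrase generator along those lines might look like this; the wordlist here is a placeholder for illustration, and in practice you would load a real dictionary of ~7000 words:

```python
import math
import secrets

# Placeholder wordlist of 7000 entries -- a real one would contain
# actual dictionary words, but only its size matters for the entropy.
wordlist = [f"word{i}" for i in range(7000)]

passphrase = " ".join(secrets.choice(wordlist) for _ in range(6))
entropy = 6 * math.log2(len(wordlist))
print(passphrase, f"~{entropy:.1f} bits")
```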
You can combine them: if you have four random words and three random digits in between, you have 10^3 * 7000^4 possible values. Again, the entropy calculation: log(10^3 * 7000^4)/log(2) ≈ 61.1 bits of entropy.
Example: foreshadow3sectionals2palm6deliberating
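Because entropies of independent random choices simply add, the combined scheme is easy to compute piecewise:

```python
import math

digits_entropy = 3 * math.log2(10)    # three random digits
words_entropy = 4 * math.log2(7000)   # four random words
total = digits_entropy + words_entropy
print(f"~{total:.1f} bits")  # ~61.1 bits
```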
It is simple to use the math and make a table of possible combinations/lengths/complexities and compare their strengths. You can mix properties until you find a combination that you like, and check that it gives you enough strength. But to answer your question generally: length beats complexity.
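As a sketch of such a table, here is a small script covering the schemes from this answer (the dictionary size and character sets are the ones assumed above):

```python
import math

# (description, complexity per element, number of elements)
schemes = [
    ("6 chars, a-zA-Z0-9",        62,    6),
    ("14 chars, a-zA-Z0-9",       62,   14),
    ("13 chars, printable ASCII", 95,   13),
    ("6 words, 7000-word list",   7000,  6),
]
for name, complexity, length in schemes:
    bits = length * math.log2(complexity)
    print(f"{name:28} {bits:5.1f} bits")
```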
Finally, it should be noted that the example you used is not as strong as N random words: the 6-word sentence "OneDayIWentToWalk" makes sense. The 6-word sentence "OneIDayWentToWalk" does not make sense. Someone guessing which passphrase you used can eliminate all the possible nonsensical sentences and try only the grammatical ones. That reduces the number of guesses required by a lot. This question at the Linguistics StackExchange explores the randomness of grammatical phrases. (Credits to Tezra for sharing that link.)
Note: this answer uses parts from another answer I posted in a different question earlier today.