
Would using a 256-bit binary string, e.g.

0000001100101011100101110101101000101000011001011010000111010100000011101100011111110010001000101010010010101101111100010011100111011100110110101001111001110110100011001010100110101101010000010100110000110011011010111101001110010101110010010001000001101001

be as strong as the underlying ASCII password it would represent?

Would a 256-bit binary string imply the same strength as 256-bit security?

Woodstock

3 Answers

2

The encoding of the value (32 bytes of binary, 256 bytes of ASCII ones and zeros, 64 bytes of hex, 44 bytes of base64, etc.) does not affect the security of the key in any way.
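For example, in Python the same 32-byte key round-trips through each of those encodings unchanged (a sketch; the key here is randomly generated just for illustration):

```python
import base64
import os

# A hypothetical 32-byte key: any 32 random bytes would do.
key = os.urandom(32)

# The same 256-bit value in several encodings: the representation
# changes, the underlying key (and its security) does not.
as_hex = key.hex()                          # 64 hex characters
as_b64 = base64.b64encode(key).decode()     # 44 base64 characters
as_bits = "".join(f"{b:08b}" for b in key)  # 256 ASCII '0'/'1' characters

# Each encoding decodes back to the identical key.
assert bytes.fromhex(as_hex) == key
assert base64.b64decode(as_b64) == key
assert bytes(int(as_bits[i:i + 8], 2) for i in range(0, 256, 8)) == key
```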

The important distinction between cryptographic keys and passwords is that when humans choose passwords, they generally only get about 20 bits of entropy. That means there are about 1 million possible passwords they could have chosen. A dictionary attack that tries the common things people do (add numbers, l33t speak, etc.) will break most passwords. Also, people reuse passwords.

A cryptographic key is generated by a computer, not by a human, and it must be generated using a CSPRNG. Reading from /dev/urandom on Linux is perfectly fine. CryptGenRandom on Windows is fine. A cryptographic key generated this way genuinely has the amount of entropy implied by its size: if it is 256 bits long, then it has 256 bits of entropy. No dictionary attack, no guessing the birthday of the user's kids or the password the user used on their luggage, would work at breaking it.
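In Python, for instance, the standard-library `secrets` module wraps the OS CSPRNG, so generating such a key is a one-liner (a sketch):

```python
import secrets

# 32 bytes from the OS CSPRNG: a genuine 256-bit key.
key = secrets.token_bytes(32)
assert len(key) == 256 // 8

# For contrast, never do this: a human-chosen passphrase padded out to
# 32 bytes (a hypothetical bad idea) has far less than 256 bits of entropy.
weak = b"correcthorsebatterystaple".ljust(32, b"\x00")
```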

So although the API might accept any 32 bytes as a cryptographic key, using a human generated password directly will break security and should be flagged in any audit/review.

There are ways to securely use passwords for cryptography (see PAKE), but the correct thing to do almost always is to generate a secure cryptographic key using a CSPRNG, almost always using the CSPRNG provided by the OS kernel.

The real amount of entropy and the possibility of reuse also dictate how the value is stored. Human-generated passwords must be stored using secure password storage (Argon2, scrypt, bcrypt, PBKDF2, all designed to be slow in order to thwart brute-force dictionary attacks on GPUs). High-entropy computer-generated keys/tokens don't need that and can be stored with a single hash, but only if all you need is to verify the value when it is provided from outside; if you need the key in the clear, use a secure key-management system.
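A sketch of that distinction in Python (the scrypt parameters here are illustrative, not a tuned recommendation):

```python
import hashlib
import hmac
import secrets

# Human-chosen password: store with a slow, salted KDF (scrypt here).
password = b"hunter2"
salt = secrets.token_bytes(16)
stored = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1)

# High-entropy token: a single fast hash suffices for later verification.
token = secrets.token_bytes(32)
token_digest = hashlib.sha256(token).digest()

# Verification when the token is presented again, in constant time.
assert hmac.compare_digest(hashlib.sha256(token).digest(), token_digest)
```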

There are ways to encode strong cryptographic keys in a way that humans can memorize them. This is rarely needed, but some people want to carry a secure electronic wallet in their head. For that, see BIP-39 and things like it.

To give a sense of how much entropy 256 bits is: it corresponds to 100 throws of a truly fair six-sided die, or alternatively to 24 words chosen from a list of 2048 words in such a way that knowing any 23 of the 24 words would not help the attacker at all: their chance of guessing the missing 24th word is still exactly 1/2048 (which means, for example, that words can repeat). It is very difficult for humans to generate that much entropy.
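The arithmetic behind those comparisons, as a quick check:

```python
import math

# Entropy of 100 throws of a fair six-sided die.
dice_bits = 100 * math.log2(6)      # about 258.5 bits

# Entropy of 24 independent uniform picks from a 2048-word list.
words_bits = 24 * math.log2(2048)   # 24 * 11 = 264 bits

assert dice_bits > 256
assert words_bits == 264
```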

Z.T.
  • Thank you, so a 256-bit cryptographically secure string is a good password? – Woodstock May 21 '19 at 15:33
  • Sure, but then you need a password manager to remember it (which you should do!). Often people mean "something I can easily remember" when they say "password". 128 bits are often (usually) enough though. – Z.T. May 21 '19 at 15:40
1

Without considering any other factors (like how the bits are generated), any encoding of 256 bits should be equally difficult to brute force as any other encoding, and your 256-bit string is just another encoding. With that said, from a practical standpoint the underlying password hashing function of the site you are logging into may have a maximum input length smaller than 256 characters. For example, bcrypt implementations may have maximum lengths between 50 and 72 bytes, which would mean only the first 50-72 characters of your string (each representing one bit) would actually be used in the hashing. On the other hand, PBKDF2 has a maximum input length equal to the maximum message length of the underlying hash, which the linked answer points out is practically infinite for this purpose (e.g. 2^61-1 bytes for SHA-1).
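To illustrate the truncation pitfall, here is a hypothetical KDF that, like bcrypt implementations, only looks at the first 72 bytes of its input (a sketch using SHA-256 in place of the real bcrypt core; the function name and salt are made up for the demo):

```python
import hashlib

def truncating_kdf(password: bytes, salt: bytes, limit: int = 72) -> bytes:
    # Hypothetical: models bcrypt's 72-byte input limit, not its algorithm.
    return hashlib.sha256(salt + password[:limit]).digest()

salt = b"fixed-salt-for-demo"
a = b"0" * 256                  # a 256-character bit string
b = b"0" * 72 + b"1" * 184     # differs only after byte 72

# Both inputs hash identically: everything past the limit is ignored.
assert truncating_kdf(a, salt) == truncating_kdf(b, salt)
```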

IllusiveBrian
0

In ASCII, the first bit of each byte is 0. Thus, a 256-bit string of ASCII contains at most 224 bits of information. Moreover, generally only the 95 printable characters are used, reducing the encoded information to at most 210 bits.
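The arithmetic, as a quick check:

```python
import math

chars = 256 // 8                    # 32 ASCII characters fit in 256 bits
seven_bit = chars * 7               # top bit of each byte is always 0
printable = chars * math.log2(95)   # about 6.57 bits per printable character

assert seven_bit == 224
assert 210 < printable < 211        # "at most 210 bits" after rounding down
```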

The information encoded will reach this maximum only if the content was generated from a uniformly random source of independent bits. If the content of the string contains words or comes from a human mind, expect far less information in the string.

However, if the 256-bit string is not limited to ASCII characters and is entirely generated from a uniformly random source of independent bits, it will contain 256 bits of information and provide a 256-bit level of security.

A. Hersean