
A couple of years ago, when I was learning basic back-end web development, I found a tutorial for creating a basic login system. I haven't modified the code much since, but I now have the opportunity to use a more robust system if I need to.

So here's the code I'm currently using:

$pepper    = "String of 24 random characters";
$salt      = dechex(mt_rand(0, 2147483647)) . dechex(mt_rand(0, 2147483647));
$loopcount = 97674;

$value = $password; // start from the submitted password

for ($i = 0; $i < $loopcount; $i++) {
    $value = hash("sha256", $value . $salt . $pepper);
}

return $value;

Basically, the password, a random salt, and a static 24-character pepper are hashed 90,000+ times. It's probably also worth mentioning that the salt is stored in the database.

My biggest question is whether hashing it that many times actually does anything. I'd also like to know if the salt and pepper are strong enough.

Meredith

3 Answers


Actually, hashing it MANY times can be bad. Here is a quote from http://yorickpeterse.com making that point:

"To cut a long story short, hashing a hash N times doesn't make your passwords more secure and can actually make it less secure as a hacker can quite easily reverse the process by generating hash collisions."

Read the full explanation at http://yorickpeterse.com/articles/use-bcrypt-fool/

H3lp3ingth3p33ps
  • +1 for using standard means and not rolling your own crypto. – Dmitry Janushkevich Jul 02 '14 at 11:39
  • Agreed, this is the better answer and includes a very good link – AlexH Jul 02 '14 at 12:17
  • 6
    While I agree very much with the conclusion, saying that you shouldn't repeatedly hash is silly. Yes, it reduces the total amount of entropy, but any hashing scheme that becomes breakable after a few percentage points of lost entropy wasn't secure anyway. Bcrypt, the suggested tool, iterates hashing internally *anyway*! – Phoshi Jul 02 '14 at 12:22
  • @Phoshi The point was repeatedly using *weak* hashing is not a way to improve it. – Cthulhu Jul 02 '14 at 13:09
  • @Cthulhu so the answer should have stated that. The way it is, it's wrong. Hashing many times is good if you add the salt (and the pepper, why not). The article is based on a wrong assumption: the 14-printable characters have LESS entropy than the 20 hex-characters hash. – woliveirajr Jul 02 '14 at 13:17
  • @Cthulhu the way the OP proposes it, concatenating random characters (if really random, will include non-printable characters too) avoid the problems that the article tries to explain. – woliveirajr Jul 02 '14 at 13:19
  • 1
    97674 iterations is negligible, and besides, iterations don’t affect collision probability in the way the author of this article seems to think they do. MD5 and SHA-1 are also perfectly fine for hashing algorithms (HMAC-SHA1 is a common choice for PBKDF2). The article also implies that a shared salt is in some way okay, so I’d take it with a grain of… well, that. – Ry- Jul 02 '14 at 19:38
  • @minitech: A point that's often missed with regard to hash collisions is if the number of possible inputs to a hash function is less than the number of possible outputs, the frequency of collisions is reduced considerably. The number of rounds of a random hash function required to lose half the entropy in a hash is hyper-exponential with regard to the hash length. For a 64-bit hash, lost entropy could be a problem, but for a 256-bit hash with decent statistical properties, it isn't. – supercat Sep 13 '14 at 05:14
  • The author is wrong. – eckes Nov 12 '17 at 06:09

I would recommend against using SHA-256 for hashing passwords. The SHA-2 suite is designed to be fast, which is exactly what you don't want. In short, use bcrypt: https://stackoverflow.com/questions/4795385/how-do-you-use-bcrypt-for-hashing-passwords-in-php

In a proper KDF, iterations are included for the same reason: to slow down the process of password hashing (which answers your question about whether hashing 90,000 times does anything). All of this defends primarily against offline brute forcing of stolen hashes.

More here: http://en.wikipedia.org/wiki/Key_derivation_function
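As a sketch of the above (illustrative only; the password strings, cost value, and output length are placeholder assumptions): PHP's built-in password_hash() uses bcrypt by default and manages the salt and iteration count ("cost") internally, while hash_pbkdf2() is the standardized form of what the question's hand-rolled loop attempts.

```php
<?php
// Recommended route: password_hash() generates its own random salt and embeds
// the salt, cost, and algorithm identifier in the returned string.
$hash = password_hash("hunter2", PASSWORD_BCRYPT, ["cost" => 12]);

// password_verify() re-derives the hash using the parameters stored in $hash.
var_dump(password_verify("hunter2", $hash));     // bool(true)
var_dump(password_verify("wrong guess", $hash)); // bool(false)

// If SHA-2 must be kept, hash_pbkdf2() is the standardized iterated construction,
// rather than a manual loop. 64 here is the requested hex output length.
$salt    = random_bytes(16);
$derived = hash_pbkdf2("sha256", "hunter2", $salt, 97674, 64);
echo strlen($derived); // 64
```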

AlexH

The security gain from hashing 90,000+ times is basically minimal. In fact, it's arguably less secure, since a prospective attacker can crack it more easily by looking for collisions. You may as well use a stronger hash (e.g. SHA-512), and therefore get a longer digest, rather than just looping through and appending previous results. All your loop really does is increase server load, so you'd probably want to improve this code however you can. The salt and pepper are fine, though.

Also, if you do use a different hash method, make sure to increase the length of the database field you're storing it in.
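To make the field-length point concrete, here is a quick sketch (the column-size suggestion is an assumption, not a requirement):

```php
<?php
// Hex-encoded digest lengths grow with the algorithm, so the column must too.
echo strlen(hash("sha256", "password")); // 64 hex characters
echo strlen(hash("sha512", "password")); // 128 hex characters

// bcrypt output via password_hash() is currently 60 characters, but a wider
// column (e.g. VARCHAR(255)) leaves room if the default algorithm changes.
echo strlen(password_hash("password", PASSWORD_BCRYPT)); // 60 characters
```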

  • So if I switch to sha512 (or bcrypt like AlexH suggested), should I still hash it multiple times? If so, how many times? – Meredith Jul 02 '14 at 11:34
  • 1
    Whichever hashing method you're using, hashing multiple times is pretty useless, at least in my experience. – Apple_Master Jul 02 '14 at 11:36
  • 2
    bcrypt includes iterations within itself, you wont need to call it multiple times, just define the number of iterations – AlexH Jul 02 '14 at 12:17
  • Collisions don’t work this way, and iterations are mandatory to make hashing slow enough; besides, the salt is used in each iteration, which totally protects against rainbow-table-style attacks. – eckes Nov 12 '17 at 06:12
  • @Meredith You should hash it so many times that it takes at least one second to repeat the process. For SHA-2 this means 2 million iterations or more. – eckes Nov 12 '17 at 06:20
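Eckes' calibration advice can be sketched roughly like this, assuming bcrypt via password_hash(); the 50 ms target is an arbitrary example value, not a recommendation:

```php
<?php
// Increase the bcrypt cost until a single hash takes at least the target time.
// Pick the target your server can actually afford per login attempt.
$target = 0.05; // seconds (placeholder value)
$cost   = 8;    // bcrypt cost is a base-2 exponent, valid range 4..31
do {
    $start = microtime(true);
    password_hash("benchmark-password", PASSWORD_BCRYPT, ["cost" => $cost]);
    $elapsed = microtime(true) - $start;
    $cost++;
} while ($elapsed < $target && $cost <= 31);

// The last cost actually measured is $cost - 1.
echo "Use cost " . ($cost - 1) . " (took " . round($elapsed * 1000) . " ms)";
```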