We regularly use 10000 iterations of SHA256 for hashing passwords.
If we want to have similar security, how many rounds/work factor should we use when hashing passwords with bcrypt?
I'd say even bcrypt cost 5 (close to the minimum; most implementations allow a floor of 4) is a good bit stronger than 10K rounds of SHA256 ... but that's still much lower than you want.
No GPU cracking suite I'm aware of currently supports attacking 10K rounds of SHA256 (though MDXfind will on CPU) - but a custom hashcat kernel or JtR fork could be created pretty easily. And since GPUs can perform billions of SHA256 operations per second, even 10,000 rounds isn't going to help you much to resist attack. So it's good that you're upgrading. :D
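For reference, "10K rounds of SHA256" usually means something like the following - a minimal illustrative sketch only; your actual scheme may chain the salt and password differently:

```python
import hashlib

def iterated_sha256(password: bytes, salt: bytes, iterations: int = 10_000) -> bytes:
    """Illustrative only: repeatedly feed the previous digest back into SHA-256."""
    digest = salt + password
    for _ in range(iterations):
        digest = hashlib.sha256(digest).digest()
    return digest
```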
So instead of "like for like", your goal should be to calibrate your bcrypt cost for general attack resistance.
You also need to account for the 'thundering herd' effect - how many simultaneous reauthentications you need to support (how many users might need to log back in at once).
Generally, bcrypt cost 10 should be the minimum here, and 12 (or more) if you can get away with it. But to tune your bcrypt cost, it's probably best to simulate real workloads yourself - generate the hashes, see how they perform under expected load, and tune the cost to be as slow as you can tolerate (with whatever headroom makes sense for your use cases).
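For example, here's a rough benchmarking sketch (assuming the Python `bcrypt` package; adapt it to whichever library you actually use) to measure per-hash time at different costs on your own hardware:

```python
import time
import bcrypt

password = b"correct horse battery staple"  # sample input, not a real credential

# Time one hash at each candidate cost factor on this machine.
for cost in range(10, 15):
    start = time.perf_counter()
    bcrypt.hashpw(password, bcrypt.gensalt(rounds=cost))
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"cost={cost}: {elapsed_ms:.0f} ms per hash")

# Pick the highest cost whose per-hash time (multiplied by your expected
# concurrent logins) still fits within what your servers and users tolerate.
```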
P.S. I've seen a rule of thumb that users can tolerate a ~300ms login delay before they start to notice. So that might be useful as an upper bound of your target bcrypt tuning. The goal should be to make it as slow as you can get away with.
It depends on the hardware you are using, but I will give an example.
For example, suppose you are using a computer that can compute 10,000,000 SHA256 hashes per second, and suppose that same computer can compute one bcrypt hash at Cost=4 in 0.001 seconds.
Then, based on that timing information, if you want a bcrypt setup that takes approximately the same time as your current 10,000-iteration SHA256 setup, you would use bcrypt Cost=4 (in your terms, work factor 4). This is because, on the example machine above, the 10,000 SHA256 hashes take approximately 10,000 / 10,000,000 hashes-per-second = 0.001 seconds, the same time as the Cost=4 hash.
Note: I am not recommending that you use bcrypt with Cost=4; I am just answering the question you asked. Furthermore, I had to supply some details of my own to fill in gaps in your question, so I am also not saying that Cost=4 is correct for your server - only for the example machine I introduced above.
If you want to do better, you can treat each integer increase in the bcrypt cost as multiplying the hashing time by a factor of 2.
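As a small sketch of that doubling rule, using the hypothetical machine above (~0.001 s per hash at Cost=4) and, say, the ~300 ms login-delay figure mentioned in the other answer as a target:

```python
import math

# Assumed figures from the hypothetical example machine above.
baseline_cost = 4      # measured cost factor
baseline_ms = 1.0      # ~0.001 s per hash at that cost
target_ms = 300.0      # target upper bound on per-hash time

# Each +1 in cost roughly doubles the time, so count whole doublings
# that still keep us at or under the target.
extra_doublings = math.floor(math.log2(target_ms / baseline_ms))  # 8
suggested_cost = baseline_cost + extra_doublings

print(suggested_cost)  # 12 -> roughly 256 ms per hash on that machine
```

Of course, these numbers only apply to the hypothetical machine; measure on your own hardware rather than trusting this arithmetic.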