I recently saw this:
I can't figure out how he computes 28 bits of entropy for a password like "Tr0ub4dor&3". It seems like very little.
He's modeling the password as the output of a randomized algorithm similar to this one:
The entropy is a function of the random choices made in the algorithm; you calculate it by identifying what random choices the algorithm makes, how many alternatives are available for each random choice, and the relative likelihood of the alternatives. I've annotated the numbers in the steps above, and if you add them up you get about 28 bits total.
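That accounting can be sketched in a few lines of code. The choice counts below are my reconstruction from the comic's own annotations (base word 16 bits, capitalization 1 bit, substitutions 3 bits, punctuation 4 bits, numeral 3 bits, ordering 1 bit); the entropy of each step is log2 of the number of alternatives available at that step.

```python
import math

# Choice counts per step, reconstructed from the comic's annotations.
# Entropy contributed by a uniform random choice = log2(#alternatives).
choices = {
    "uncommon base word": 2**16,    # ~65,000-word dictionary -> 16 bits
    "capitalization": 2,            # capitalize or not -> 1 bit
    "common substitutions": 2**3,   # which letters get 0/4-style swaps -> 3 bits
    "punctuation symbol": 2**4,     # ~16 likely symbols -> 4 bits
    "numeral": 2**3,                # the comic rounds 10 digits to 3 bits
    "symbol/numeral order": 2,      # "&3" vs "3&" -> 1 bit
}

total_bits = sum(math.log2(n) for n in choices.values())
print(total_bits)  # 28.0
```

Because the choices are independent, the log2 contributions simply add, which is why the comic can tally little squares step by step.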
You can see that Munroe's procedure isn't hard science by any means, but it's not an unreasonable estimate either. He's practicing the art of the quick-and-dirty estimate, which he very often demonstrates in his work—not necessarily getting the right number, but forming a quick idea of its approximate magnitude.
Each small square in the comic represents one bit of entropy being accounted for.
There is some reasoning behind it. For example, when a password policy requires capital letters, almost everybody capitalizes the first letter, so you don't get much more than a single bit of entropy out of it.
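To see why a predictable choice is worth so little, here is a small illustration (the 90% figure is my own example, not from the comic): the Shannon entropy of a biased yes/no choice falls well below the 1 bit a fair coin flip would give.

```python
import math

def binary_entropy(p):
    """Shannon entropy in bits of a yes/no choice made with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no entropy
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Uniform choice: a full bit of entropy.
print(round(binary_entropy(0.5), 3))  # 1.0

# Hypothetical: 90% of users capitalize the first letter.
print(round(binary_entropy(0.9), 3))  # 0.469
```

So even granting the capitalization step its full bit, as the comic does, is a generous upper bound.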