Data entropy depends on the observer; there is no absolute measurement of entropy. It's even questionable whether anything in the universe is random at all, and "randomness" (or, more precisely in relation to entropy, unpredictability) is the source of entropy.
Unpredictability being the operative term: hard for somebody to predict.
If you use the Mersenne Twister, for example, knowing the seed perfectly predicts the entire sequence - so your "random" password contains at most the entropy of the seed: 64 bits (if you use the 64-bit version, that is).
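To see why the seed caps the entropy, here is a minimal sketch in Python, whose `random` module happens to use the Mersenne Twister internally. The alphabet and password length are arbitrary choices for illustration:

```python
import random

# Alphabet and length are illustrative choices, not a recommendation.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789"

def make_password(seed, length=12):
    # The generator's entire state is determined by the seed, so the
    # "random" password is a pure function of that seed.
    rng = random.Random(seed)
    return "".join(rng.choice(ALPHABET) for _ in range(length))

# Two runs with the same seed yield the identical password: an attacker
# who learns (or brute-forces) the seed recovers the password exactly.
assert make_password(0xDEADBEEF) == make_password(0xDEADBEEF)
```

However long the password looks, the search space an informed attacker faces is the seed space, not the space of all 12-character strings.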
If you use Diceware, then the entropy stems from the number of times you rolled the dice, and that's it.
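Concretely, each roll of a fair six-sided die contributes log2(6) bits, and standard Diceware uses five rolls per word to index a list of 6^5 = 7776 words. A quick sketch of the arithmetic:

```python
import math

# Each fair d6 roll contributes log2(6) ~ 2.585 bits.
BITS_PER_ROLL = math.log2(6)

# Standard Diceware: 5 rolls select one word from a 7776-word list.
BITS_PER_WORD = 5 * BITS_PER_ROLL  # = log2(7776) ~ 12.9 bits

def diceware_entropy(num_words):
    """Entropy in bits of a passphrase of num_words Diceware words."""
    return num_words * BITS_PER_WORD

# A six-word passphrase carries about 77.5 bits.
print(f"{diceware_entropy(6):.1f} bits")
```

Note this counts only the dice rolls: the words themselves add nothing, because the word list is public.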
Unfortunately, by the time it becomes a "password", the source of entropy is obscured.
For example: a three-value safe code where each value is in the range [0,99] has 3*log2(100), or about 19.9, bits of entropy. Until you learn that the owner selected a six-letter word and used a phone keypad to turn it into numbers, and now the entropy is log2(numberOfSixLetterWords).
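The collapse in the example above is easy to put numbers on. The dictionary size below (15,000 six-letter words) is a hypothetical figure for illustration, not a measured count:

```python
import math

# Naive model: three independent values, each uniform in [0, 99].
naive_bits = 3 * math.log2(100)

# Informed model: the code came from a six-letter word typed on a
# phone keypad. 15,000 is a hypothetical dictionary size.
informed_bits = math.log2(15_000)

print(f"naive:    {naive_bits:.1f} bits")    # ~ 19.9
print(f"informed: {informed_bits:.1f} bits") # ~ 13.9
```

Same safe, same code; the entropy changed only because the attacker's knowledge did.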
In short, the assumptions used to make a password are so fundamental to its entropy, and so obscured by the immediate appearance of the password, that you really cannot estimate it; you can only ever get an upper bound on the password's entropy.