Closed AdisonCavani closed 2 years ago
Like they said, the "entropy" in this library before 4.0.1 was a sloppy use of the term. Password entropy is calculated as

E = L * log2(R)

where L is the length of the password and R is the size of the character pool. This has nothing to do with a guess count calculated by some library. You can check it out in more detail here: https://www.omnicalculator.com/other/password-entropy
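As a minimal sketch of that formula: the pool-size detection below is an assumption for illustration (common character-class sizes), not how any particular library implements it.

```javascript
// Classic password entropy: E = L * log2(R).
// charPoolSize() is an illustrative assumption: it adds up the sizes of
// the character classes that actually occur in the password.
function charPoolSize(password) {
  let r = 0;
  if (/[a-z]/.test(password)) r += 26;           // lowercase letters
  if (/[A-Z]/.test(password)) r += 26;           // uppercase letters
  if (/[0-9]/.test(password)) r += 10;           // digits
  if (/[^a-zA-Z0-9]/.test(password)) r += 33;    // printable ASCII symbols
  return r;
}

function passwordEntropy(password) {
  const L = password.length; // L: password length
  const R = charPoolSize(password); // R: character pool size
  return L * Math.log2(R);
}

console.log(passwordEntropy('t3XKczXFIOrqHRr_'));
```

Note this is a property of the character pool and length only; it is independent of any matcher or guess count.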
I think there are different reasons why you are getting a different "entropy" than with the old library. The math changed: both how the guesses are calculated and how the sequences are found. For example, for 't3XKczXFIOrqHRr_' the new page finds only a single bruteforce sequence, while the old one finds several different ones:
- dictionary: t3X
- dictionary: I
- dictionary: Or
- bruteforce: KczXF
- bruteforce: qHRr_
The entropy was calculated for every sequence found and then added up. Note that summing per-sequence logarithms is not the same as taking the logarithm of a single combined count, for example:

Math.log2(100000) + Math.log2(100000) = 33.219280948873624
Math.log2(200000) = 17.609640474436812

So the "entropy" you calculated, log2(10000000000000000) = 53.15, is correct for the new guess count with the new sequences. You would just need to take log2 of the guess count of each sequence found, instead of the overall guesses value, and add the results up.
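That per-sequence summation can be sketched like this; the sequence list and guess counts below are made-up illustrative values, not real output of the library:

```javascript
// Sketch of the pre-4.0.1 behaviour: sum log2(guesses) over each
// matched sequence, rather than taking log2 of one overall guess count.
// These sequences and guess counts are illustrative placeholders.
const sequences = [
  { pattern: 'dictionary', token: 't3X', guesses: 100000 },
  { pattern: 'bruteforce', token: 'KczXF', guesses: 100000 },
];

// Old-style "entropy": add up log2(guesses) per sequence.
const entropyPerSequence = sequences.reduce(
  (sum, s) => sum + Math.log2(s.guesses),
  0
);

// Because log2(a) + log2(b) === log2(a * b), this equals log2 of the
// product of the per-sequence guess counts.
const totalGuesses = sequences.reduce((p, s) => p * s.guesses, 1);

console.log(entropyPerSequence);      // ~33.219
console.log(Math.log2(totalGuesses)); // same value
```

This also shows why summing per-sequence values only matches log2 of the overall guess count when that overall count is the product of the per-sequence counts.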
But this "entropy" is worthless and doesn't say anything, because it's not even the "real" entropy of a password.
The password entropy property was removed in the 4.0.1 release. I was wondering how I can bring back the old method of calculating password entropy in bits. I've read about the method that was used for calculating it, and I tried to do this, but I'm getting totally different results.
This page is using the old method, and this page is using the new method; I'm using them for comparison.

Password: t3XKczXFIOrqHRr_
Guesses (new): 10000000000000000
Guesses_log10 (new): 16
Entropy (old): 90.578 bits

My calculations: log2(10000000000000000) = 53.15
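As a sketch, my number above can also be obtained from the reported guesses_log10 value by a change of base:

```javascript
// Change of base: log2(guesses) = log10(guesses) * log2(10).
const guessesLog10 = 16; // the Guesses_log10 value reported above
const bits = guessesLog10 * Math.log2(10);
console.log(bits.toFixed(2)); // "53.15"
```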
What's a proper formula for calculating entropy from guesses?