Ahoy.
I have created a new function, nwipe_check_entropy.
Explanation of the entropy.c File
The file entropy.c contains several functions to evaluate the entropy, which measures the randomness of a 64-bit value (uint64_t), in our case both for the AES-CTR/AES-XTS data quality and for the initial PRNG seed from /dev/urandom. These functions are used to determine whether a given 64-bit value contains sufficient randomness, which is crucial for security-related purposes like a secure data wipe. Below is an explanation of the algorithms used and the conditions that determine a pass or fail.
In the current implementation, if /dev/urandom fails, the program bails out and prints a fatal log message. I'll work on alternative seed sources so we can use them instead, but that requires a bit of work as the current entropy source is somewhat hardcoded.
Algorithms Used
Shannon Entropy (shannon_entropy):
Purpose: Measures the information content based on the distribution of bits (0s and 1s). Higher entropy indicates higher randomness.
Approach:
Counts the number of 0s and 1s in the 64-bit number.
Calculates the probability of a bit being 0 (p0) and 1 (p1).
Computes the entropy as -p0 * log2(p0) - p1 * log2(p1).
If all bits are the same (p0 or p1 is 0), returns an entropy of 0.0 (minimum entropy). A rough C sketch of this calculation follows below.
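To make the idea concrete, here is a rough sketch of the calculation (the _sketch name is illustrative, not the exact code in entropy.c):

```c
#include <stdint.h>
#include <math.h>

/* Illustrative sketch only; names and layout differ from entropy.c. */
static double shannon_entropy_sketch( uint64_t value )
{
    int ones = 0;
    for( int i = 0; i < 64; i++ )
    {
        ones += (int) ( ( value >> i ) & 1ULL );
    }

    double p1 = (double) ones / 64.0;
    double p0 = 1.0 - p1;

    /* All bits identical -> minimum entropy. */
    if( p0 == 0.0 || p1 == 0.0 )
    {
        return 0.0;
    }

    return -p0 * log2( p0 ) - p1 * log2( p1 );
}
```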
Bit Frequency Test (bit_frequency_test):
Purpose: Checks the proportion of 1s in the number. For high entropy, this proportion should be close to 0.5, indicating an even distribution of 0s and 1s.
Approach:
Counts the number of 1s in the 64-bit number.
Calculates the proportion of 1s (count_ones / N), where N is 64; see the sketch below.
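As a sketch (again with an illustrative name, not the exact entropy.c code):

```c
#include <stdint.h>

/* Illustrative sketch only. Returns the proportion of 1 bits. */
static double bit_frequency_sketch( uint64_t value )
{
    int count_ones = 0;
    for( int i = 0; i < 64; i++ )
    {
        count_ones += (int) ( ( value >> i ) & 1ULL );
    }
    /* Close to 0.5 means an even distribution of 0s and 1s. */
    return (double) count_ones / 64.0;
}
```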
Runs Test (runs_test):
Purpose: Checks for sequences (runs) of identical bits. A random sequence should have a moderate number of runs, indicating that the bits alternate regularly.
Approach:
Counts the number of runs in the sequence of bits (a run is a sequence of consecutive identical bits).
Starts with the least significant bit and checks each bit against the next one, counting how many times the bit value changes; see the sketch below.
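An illustrative sketch of the counting (not the exact entropy.c code):

```c
#include <stdint.h>

/* Illustrative sketch only. Counts runs of consecutive identical bits. */
static int runs_test_sketch( uint64_t value )
{
    int runs = 1; /* the least significant bit starts the first run */
    for( int i = 0; i < 63; i++ )
    {
        uint64_t current = ( value >> i ) & 1ULL;
        uint64_t next = ( value >> ( i + 1 ) ) & 1ULL;
        if( current != next )
        {
            runs++; /* every change in bit value starts a new run */
        }
    }
    return runs;
}
```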
Auto-correlation Test (auto_correlation_test):
Purpose: Checks how often consecutive bits are the same. A low correlation implies higher randomness.
Approach:
Compares each bit with the next one in the sequence.
Counts the number of matches (where consecutive bits are the same).
Returns the proportion of matches to total comparisons; see the sketch below.
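An illustrative sketch (not the exact entropy.c code):

```c
#include <stdint.h>

/* Illustrative sketch only. Proportion of adjacent bit pairs that match. */
static double auto_correlation_sketch( uint64_t value )
{
    int matches = 0;
    for( int i = 0; i < 63; i++ )
    {
        uint64_t current = ( value >> i ) & 1ULL;
        uint64_t next = ( value >> ( i + 1 ) ) & 1ULL;
        if( current == next )
        {
            matches++;
        }
    }
    /* 63 adjacent pairs exist in a 64-bit value. */
    return (double) matches / 63.0;
}
```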
Conditions for Pass or Fail
The main function nwipe_check_entropy combines the results of the above methods to decide whether a number has sufficient entropy. The following conditions must be met for a pass:
Shannon Entropy: Must be greater than 0.9. This indicates high unpredictability in the bit distribution.
Bit Frequency: Should be between 0.3 and 0.7, indicating that there is no extreme bias towards 0s or 1s.
Runs Test: The number of runs should be between 10 and 54, suggesting a balance between too few and too many runs.
Auto-correlation: Must be less than 0.7, indicating that consecutive bits are not overly correlated.
If all these criteria are met, the function logs a message indicating sufficient randomness and returns 1 (pass). If any criterion is not met, it logs an error message indicating insufficient randomness and returns 0 (fail).
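Put together, the decision logic looks roughly like this (an illustrative sketch reusing the *_sketch helpers from above; the actual nwipe_check_entropy also does the logging and may handle the boundary values slightly differently):

```c
#include <stdint.h>

/* Illustrative sketch only. Thresholds mirror the description above. */
static int check_entropy_sketch( uint64_t value )
{
    double entropy = shannon_entropy_sketch( value );
    double frequency = bit_frequency_sketch( value );
    int runs = runs_test_sketch( value );
    double correlation = auto_correlation_sketch( value );

    if( entropy > 0.9
        && frequency >= 0.3 && frequency <= 0.7
        && runs >= 10 && runs <= 54
        && correlation < 0.7 )
    {
        return 1; /* pass: sufficient randomness */
    }
    return 0; /* fail: insufficient randomness */
}
```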