Closed toxicwind closed 1 year ago
Oh nice. I'll check this out when I am back home. Sounds nice.
Just for reference, the current randomness is fine for small wildcard files; the secrets library's randomness only matters when a wildcards file has a massive number of lines. I noticed that the numpy random was producing a "noticeable" pattern of repeats.
I noticed the same thing: my popular-locations and national-parks lists kept repeating the same "set" of places/parks.
Well, it turns out that even the secrets module isn't random enough; the code changes above use SystemRandom, which is supposedly cryptographically reliable randomness...
Doesn't appear to be seedable, which is a deal breaker. It needs to be deterministic.
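To illustrate the trade-off being discussed: a seeded `random.Random` instance is fully reproducible, while `random.SystemRandom` draws from the OS entropy source and silently ignores any seed. A minimal sketch:

```python
import random

# A seeded PRNG is deterministic: the same seed always yields the same stream.
a = random.Random(42)
b = random.Random(42)
assert [a.randrange(100) for _ in range(5)] == [b.randrange(100) for _ in range(5)]

# SystemRandom reads from os.urandom(); its seed() method is a no-op,
# so results can never be reproduced from a seed.
c = random.SystemRandom()
c.seed(42)  # accepted but silently ignored
```

This is why a workflow that must be replayable (e.g. regenerating an image from a saved seed) can't rely on `SystemRandom` or `secrets` for its selections.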
I think the best way to improve perceived randomness within X calls is a history: use the node's persistent class to store the X previous results, ensure a new result isn't among them (assuming the total number of wildcards is above X), then evict the oldest results as the history's max size is exceeded.
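The history idea above could be sketched roughly as follows. This is a hypothetical illustration, not the node's actual API: the class name, `history_size` parameter, and in-memory storage are assumptions (the real node would persist the history via its persistent class), and it stays seedable so runs remain deterministic.

```python
import random
from collections import deque

class WildcardPicker:
    """Pick lines from a wildcard list while avoiding recent repeats.

    Hypothetical sketch: stores the last `history_size` picks and excludes
    them from the next draw, as long as the list is larger than the window.
    """

    def __init__(self, lines, history_size=16, seed=None):
        self.lines = list(lines)
        self.rng = random.Random(seed)  # seedable, so results are reproducible
        # Cap the window below the list length so candidates never run dry.
        window = min(history_size, max(len(self.lines) - 1, 0))
        self.history = deque(maxlen=window)  # oldest entries drop automatically

    def pick(self):
        # Exclude everything still in the history window.
        candidates = [line for line in self.lines if line not in self.history]
        choice = self.rng.choice(candidates or self.lines)
        self.history.append(choice)
        return choice
```

With `deque(maxlen=...)`, appending past the cap evicts the oldest entry for free, which matches the "clear out oldest results as max size is exceeded" step.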
- Added the `secrets` module for superior randomness. The `secrets` module provides a stronger level of randomness, making our selections less predictable and more varied.
- Modified file reading to handle encoding errors gracefully. With the `errors="ignore"` parameter, we can prevent crashes due to unexpected encoding issues.
- Switched from `random.choice()` to `secrets.choice()` for selecting lines. `secrets.choice()` ensures a higher degree of randomness when picking lines, enhancing the unpredictability of our selections.