`RANDOM_DELAY` is set at process startup, by defining a randomized scaling factor:
https://github.com/cronie-crond/cronie/blob/75843b4bb1509f815eae3bb014748749a8765aae/src/cron.c#L307-L313

```c
RandomScale = (double)random() / (double)RAND_MAX;
```
This is incorrect: as per the specs, `random()` returns a value within `0 ... 2^31-1`. It is the `rand()` function which returns a value within `0 ... RAND_MAX`.
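Since `random()` is specified to return values up to `2^31-1`, one way to keep the factor within `0.0 ... 1.0` on every platform would be to divide by that documented maximum instead of `RAND_MAX`. A minimal standalone sketch of that idea (an illustration only, not necessarily the change cronie adopted):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    srandom((unsigned)time(NULL));

    /* Divide by random()'s documented maximum (2^31 - 1) rather than RAND_MAX,
     * so the scale factor stays within 0.0 .. 1.0 regardless of the platform. */
    double RandomScale = (double)random() / (double)((1UL << 31) - 1);
    printf("scale factor: %f\n", RandomScale);
    return 0;
}
```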
On Linux/glibc and OSX, `RAND_MAX` is `2^31-1`. But on other systems it's not, as observed on
A trivial way to observe it:

```c
#include <stdlib.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    /* random()'s documented upper bound, 2^31 - 1 ... */
    printf("%lu\n", (1lu << 31) - 1);
    /* ... versus this platform's RAND_MAX, which may be much smaller. */
    printf("%lu\n", (unsigned long)RAND_MAX);
    return 0;
}
```
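Assuming the snippet is saved as, say, `randmax.c`, it can be built and run with `cc randmax.c -o randmax && ./randmax`. On Linux/glibc both lines print `2147483647`; on a platform with a smaller `RAND_MAX` the second line is smaller, which is exactly when the scaling above breaks.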
This results in invalid scale factors, as reported in the startup logs:

```
INFO(54211): RANDOM_DELAY will be scaled with factor 4018247% if used.
```
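For a sense of the magnitude: on a hypothetical platform where `RAND_MAX` is only `2^15-1` (the minimum the C standard allows), `random()` can still return values up to `2^31-1`, so the computed scale can reach roughly 65538, i.e. several million percent, which is the same order of magnitude as the factor in the log line above:

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical worst case: RAND_MAX is only 2^15 - 1 (the minimum the
     * C standard allows), while random() may return up to 2^31 - 1. */
    double small_rand_max = 32767.0;
    double max_random     = 2147483647.0;

    double scale = max_random / small_rand_max;
    printf("worst-case scale: %.0f (%.0f%%)\n", scale, scale * 100.0);
    return 0;
}
```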