Open tomkooij opened 5 years ago
This kind of calculation does not seem to be used much in SAPPHiRE (I searched for `% int(1e9)`): only a few simulations and `time_deltas` use it, and in both cases Python ints are used, so this should not be an issue.
@153957 : I removed CRITICAL from the title. Maybe not so critical after all.

It does bug me that this breaks when using `numpy.uint64` vs `int`. (~~This might actually be a `numpy.uint64` bug/flaw~~ EDIT: No, this is expected behaviour: `int % numpy.uint64` is evaluated by first casting both values to `float`, with a loss of precision.) We must cast everything to `int` before calculating the modulus.
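For illustration, a minimal sketch of that fix (cast first, then take the modulus):

```python
from numpy import uint64

np_ext_timestamp = uint64(1547942403_452849057)  # e.g. as read from HDF5

# Cast to a Python int first, so the modulus is computed with
# arbitrary-precision integers instead of being routed through float64:
int(np_ext_timestamp) % int(1e9)  # 452849057 (exact)
```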
I'd like to fix this by creating a `sapphire.utils.split_ext_timestamp` function that is safe for both `int` and `numpy.uint64`, and just import / use that everywhere (also in publicdb). But I just fixed it in `publicdb.api.datastore` for now.
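A minimal sketch of what such a helper could look like (a hypothetical implementation of the proposed `sapphire.utils.split_ext_timestamp`, not actual SAPPHiRE code):

```python
from numpy import uint64


def split_ext_timestamp(ext_timestamp):
    """Split an extended timestamp into a (timestamp, nanoseconds) pair.

    Safe for both Python int and numpy.uint64 input: the value is cast to
    a Python int before the integer division and modulus, so nothing is
    promoted to float64.
    """
    ext_timestamp = int(ext_timestamp)
    return ext_timestamp // 1_000_000_000, ext_timestamp % 1_000_000_000


# Example: both input types give the same exact (ts, ns) pair
split_ext_timestamp(1547942403_452849057)          # (1547942403, 452849057)
split_ext_timestamp(uint64(1547942403_452849057))  # (1547942403, 452849057)
```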
```python
from numpy import uint64

ext_timestamp = 1547942403_452849057
np_ext_timestamp = uint64(ext_timestamp)
ext_timestamp == np_ext_timestamp  # True

# timestamp
ext_timestamp // int(1e9)     # 1547942403
np_ext_timestamp // int(1e9)  # 1547942403.0

# nanoseconds
ext_timestamp % int(1e9)             # 452849057
np_ext_timestamp % int(1e9)          # 452849152.0
np_ext_timestamp % uint64(int(1e9))  # 452849057
```
```python
# Using numpy <= 1.24
1400000002_000000050 == uint64(1400000002_000000049)  # True

# Using numpy >= 1.25  <- correct answer
1400000002_000000050 == uint64(1400000002_000000049)  # False
```
For comparisons (`uint64 == int`) the newer numpy does not cast to `float64`, but for arithmetic (`uint64 + int`) it still converts both values to `float64`. 🤯
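A quick way to see that promotion happen is to check the type of a mixed operation (a small illustration; the exact behaviour depends on the NumPy version, as noted above):

```python
import numpy as np

big = 1400000002_000000050

# Mixed uint64 / Python int arithmetic promotes to float64 under the legacy
# (pre-NEP 50) promotion rules, silently losing precision above 2**53:
print(type(np.uint64(big) + 1))   # <class 'numpy.float64'> on older NumPy releases

# Comparisons differ per version (see the snippet above):
print(big == np.uint64(big - 1))  # True on numpy <= 1.24, False on numpy >= 1.25
```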
See code below:
An `ext_timestamp` read from HDF5 is `numpy.uint64`, which (apparently) follows different math rules than `int`... This might break a lot of (all) code that transforms `ext_timestamp` into `ts, ns` pairs. (Fix: cast to `int`.)