Closed KristofferC closed 11 months ago
I was told that this might just be a bug in the old parser.
Yeah, this is a bug in the old parser: an `f` suffix was treated as a Float32 suffix in one place in the code, but never actually produced a Float32:
```julia
julia> 0x1.62e430p-1f
0.6931471824645996

julia> typeof(0x1.62e430p-1f)
Float64
```
So how should we update our code, given that this breaks https://github.com/SciML/DiffEqBase.jl/blob/master/src/fastpow.jl#L59-L62? What's the form that gives the same result on all versions?
Isn't it just to remove the `f`?
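To illustrate (a sketch, not necessarily the exact fastpow.jl change): dropping the suffix leaves a plain hex float literal, which parses as `Float64` on every version, and an explicit `Float32(...)` conversion can be added if a 32-bit constant was the intent:

```julia
# Without the bogus `f` suffix the literal parses as Float64 everywhere.
# (0x1.62e430p-1 encodes log(2) rounded to Float32 precision.)
x = 0x1.62e430p-1
@assert x isa Float64
@assert x == 0.6931471824645996

# If a Float32 constant is actually wanted, convert explicitly;
# the literal is exactly representable in Float32, so this is lossless.
y = Float32(0x1.62e430p-1)
@assert y isa Float32
@assert Float64(y) == x
```

This matches the old parser's actual behavior (the `f` was already being ignored), so removing it changes nothing numerically.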
Reported by @ChrisRackauckas. In 1.9:
but with JuliaSyntax:
Seems to be an issue in the tokenizer: