JuliaLang / JuliaSyntax.jl

The Julia compiler frontend

Bug in tokenizer for certain floating point identifier syntax #378

Closed: KristofferC closed this issue 11 months ago

KristofferC commented 11 months ago

Reported by @ChrisRackauckas

In Julia 1.9:

julia> Meta.parse("0x1.62e430p-1f")
0.6931471824645996

but with JuliaSyntax:

julia> Meta.parse("0x1.62e430p-1f")
:(0.6931471824645996 * f)

julia> 0x1.62e430p-1f
ERROR: UndefVarError: `f` not defined
Stacktrace:
 [1] top-level scope
   @ REPL[1]:1

Seems to be an issue in the tokenizer:

julia> JuliaSyntax.tokenize("0x1.62e430p-1f")
2-element Vector{Token}:
 Token(JuliaSyntax.SyntaxHead(K"Float", 0x0000), 0x00000001:0x0000000d)
 Token(JuliaSyntax.SyntaxHead(K"Identifier", 0x0000), 0x0000000e:0x0000000e)
KristofferC commented 11 months ago

I was told that this might just be a bug in the old parser.

c42f commented 11 months ago

Yeah, this is a bug in the old parser: an `f` suffix was treated as a Float32 suffix in one place in the code, but never actually produced a Float32:

julia> 0x1.62e430p-1f
0.6931471824645996

julia> typeof(0x1.62e430p-1f)
Float64
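
In other words, the old parser consumed the suffix but ignored it, so the suffixed and unsuffixed spellings already produce the same Float64. A minimal check of that claim (run under the old parser, e.g. Julia 1.9; the variable names here are only illustrative):

# Under the old parser the trailing `f` is silently discarded, so both
# spellings parse to the identical Float64 literal.
with_suffix    = Meta.parse("0x1.62e430p-1f")
without_suffix = Meta.parse("0x1.62e430p-1")
@assert with_suffix === 0.6931471824645996
@assert without_suffix === 0.6931471824645996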
ChrisRackauckas commented 11 months ago

So how should we update our code, given that this breaks https://github.com/SciML/DiffEqBase.jl/blob/master/src/fastpow.jl#L59-L62? What form gives the same result on all versions?

KristofferC commented 11 months ago

Isn't the fix just to remove the `f`?
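
For reference, a minimal sketch of that fix (the constant names here are hypothetical, not taken from fastpow.jl):

# Portable spelling: the bare hex-float literal parses identically under
# both the old parser and JuliaSyntax.
const LOG2_APPROX = 0x1.62e430p-1   # hypothetical name; == 0.6931471824645996

# If a Float32 value was the original intent behind the `f` suffix,
# convert explicitly instead of relying on a suffix:
const LOG2_APPROX_F32 = Float32(0x1.62e430p-1)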