Closed. ErikMcClure closed this issue 3 years ago.
This reproduces on Linux on my machine.
Reproduces on OS X as well; the disassembly shows it casts to bool via `(x & 0x1) != 0`, but turns `not [bool](x)` into `int8(x) == 0`. So `(2 & 0x1) != 0` is false, and `int8(2) == 0` is also false. This also means that 4 is false, 5 is true, 6 is false, etc.: positive evens are false, and positive odds are true. This is pretty bizarre, but is this Terra's fault, or is it LLVM's?
```llvm
// return [bool](x)
define zeroext i1 @"$b"(i32) {
entry:
  %1 = and i32 %0, 1
  %2 = icmp ne i32 %1, 0
  ret i1 %2
}

// return not [bool](x)
define zeroext i1 @"$nb"(i32) {
entry:
  %1 = trunc i32 %0 to i8
  %2 = icmp eq i8 %1, 0
  ret i1 %2
}
```
I am 99% sure this is Terra, not LLVM, because the only way LLVM could do something weird is if you told LLVM to do something weird, like cast an i32 to an i1, when Terra should instead be telling LLVM to do a comparison against zero. When I built my webassembly compiler, I told LLVM to do a proper comparison when converting an int to a boolean and it worked fine, which means Terra is telling LLVM to do something super bizarre here.
I agree. We're generating the wrong LLVM IR.
It shouldn't be too bad to fix, but it does require someone to dig into the LLVM codegen side of the compiler.
Seriously, what the fuck is going on?
Tested using commit de24c6 on Windows.