When a slice's length is set from a c.Uint value (i32) in LLGo, comparisons that later read that length produce incorrect results, leading to unexpected behavior such as loop bodies executing even when the slice length is 0.
Investigation
The issue arises because the c.Uint length is stored into the slice header as an i32, while the later read of the slice's len field loads the full i64. The narrow store leaves the upper 32 bits of the 64-bit len field uninitialized, so the comparison can see a nonzero value even when the length is 0, sending the program down the wrong path.
The generated IR code for the relevant part is as follows:
In the IR code, we can see that the Slice length is stored as i32 (%9 and %10), but when extracting the length (%12) and performing the comparison (%14), it is treated as i64, leading to the incorrect comparison result.
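The original listing is not reproduced here; a hand-written sketch of the shape being described (the register numbers and the slice-header layout are illustrative, not LLGo's actual output) might look like:

```llvm
; illustrative sketch -- not the actual LLGo output
%9  = getelementptr { ptr, i64, i64 }, ptr %slice, i32 0, i32 1
store i32 %n, ptr %9        ; i32 store: upper 32 bits of the len field stay stale
; ...
%12 = load i64, ptr %9      ; len read back as a full i64
%14 = icmp eq i64 %12, 0    ; comparison sees the stale upper bits
```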
IR Code (uint)
However, when using the uint type instead of c.Uint, the generated IR code stores the length as i64:
In this case, the Slice length is stored as i64 (%9 and %10), and the comparison (%14) is performed correctly using i64, avoiding the issue.
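In sketch form (illustrative registers and header layout, assumed rather than taken from actual output), the uint version writes and reads the len field at matching width:

```llvm
; illustrative sketch -- not the actual LLGo output
%9  = getelementptr { ptr, i64, i64 }, ptr %slice, i32 0, i32 1
store i64 %n, ptr %9        ; uint length is already i64: the whole field is written
; ...
%12 = load i64, ptr %9
%14 = icmp eq i64 %12, 0    ; load and comparison widths now match
```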