The "& 0xFFFF" code gets translated to "% (uint16)65536" due to the dtype of A. But 65536 == 0x10000, and when casted to uint16, becomes 0. Not sure if the backend really generates "% 0" code but this is undefined (and possibly why the output is not updated -- although would have expected a runtime failure).
Why not just leave it as bitwise_and?
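For reference, a minimal standalone C sketch of the arithmetic involved (a hypothetical reproduction of the problem, not the code the backend actually emits):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t x = 0x1234;

    /* The rewrite relies on the identity x & (2^k - 1) == x % 2^k.
       For k == 16 the divisor is 65536, which does not fit in uint16_t. */
    printf("x & 0xFFFF    = 0x%04x\n", (unsigned)(x & 0xFFFF));

    /* Converting 65536 to uint16_t is well-defined and wraps to 0. */
    uint16_t divisor = (uint16_t)65536;
    printf("(uint16)65536 = %u\n", (unsigned)divisor);

    /* x % divisor is x % 0: undefined behavior in C. Depending on the
       target it may trap or produce garbage, either of which would fit
       the symptom of the output never being updated. */
    /* printf("%u\n", x % divisor);  -- deliberately left unexecuted */
    return 0;
}
```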
Code:
Output:
The "& 0xFFFF" code gets translated to "% (uint16)65536" due to the dtype of A. But 65536 == 0x10000, and when casted to uint16, becomes 0. Not sure if the backend really generates "% 0" code but this is undefined (and possibly why the output is not updated -- although would have expected a runtime failure). Why not just leave it as bitwise_and?