Seeing a strange issue where the MSB is always set in every byte but the first when using i2cReadSync.
For example, reading the firmware version from an SL030 RFID module results in the buffer: <Buffer 0c f0 80 d3 cc b0 b3 b0 ad b4 ae b0>
Bytes 1 and 2 are correct. Byte 3 should be 00. Bytes 4-12 are the firmware version as an ASCII string, and the value is correct if the MSB is cleared from each byte. The expected buffer is: <Buffer 0c f0 00 53 4c 30 33 30 2d 34 2e 30>
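For what it's worth, the bad bytes are exactly the expected bytes with bit 7 forced high. A quick Node check (using only the two buffers above, no assumptions about the library) illustrates the pattern:

```js
// Received vs. expected firmware-version read from the SL030.
const received = Buffer.from([0x0c, 0xf0, 0x80, 0xd3, 0xcc, 0xb0, 0xb3, 0xb0, 0xad, 0xb4, 0xae, 0xb0]);
const expected = Buffer.from([0x0c, 0xf0, 0x00, 0x53, 0x4c, 0x30, 0x33, 0x30, 0x2d, 0x34, 0x2e, 0x30]);

for (let i = 0; i < received.length; i++) {
  // Byte 0 comes through untouched; every later byte equals expected | 0x80.
  const ok = i === 0
    ? received[i] === expected[i]
    : received[i] === (expected[i] | 0x80);
  console.log(i, received[i].toString(16), expected[i].toString(16), ok);
}
// Every line prints true, so only bit 7 is being corrupted.
```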
Another example: reading the tag information gives a Buffer of <Buffer 07 81 80 fd d6 f5 ec 81>
Here, byte 1 is correct (the length). Bytes 2 and 3 should be 01 and 00. Bytes 4 and 5 are correct because their true values really do have the MSB set. The remaining bytes mistakenly have their MSB set.
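This second read also shows why simply masking off bit 7 in JavaScript isn't a real workaround: bytes that legitimately have the MSB set get corrupted too. A small sketch, again using only the buffer above:

```js
// Strip bit 7 from every byte of the tag-info read as a band-aid.
const tagInfo = Buffer.from([0x07, 0x81, 0x80, 0xfd, 0xd6, 0xf5, 0xec, 0x81]);
const masked = Buffer.alloc(tagInfo.length);
for (let i = 0; i < tagInfo.length; i++) {
  masked[i] = tagInfo[i] & 0x7f;
}
console.log(masked);
// <Buffer 07 01 00 7d 56 75 6c 01>
// Bytes 2 and 3 are now the expected 01 and 00, but bytes 4 and 5
// (0xfd, 0xd6) have lost their genuine high bit, so the data is still wrong.
```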
I'm gonna try to dig into the C++, but no promises.... :)