Hi,
The Mako 030B has 644x482 pixels. If I read a frame in Mono12Packed mode, it should give me 644*482*1.5 uint8 bytes that I can convert to a 644x482 uint16 image (12-bit depth).
frame.buffer_data() contains the correct number of bytes, and I can convert that to a 16-bit image. If I try to use frame.buffer_data_numpy() instead, I get a 644x482 uint8 array, which is just the first 2/3 of frame.buffer_data(). It is impossible to recreate the full 12-bit depth image from it.
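For reference, this is roughly how I convert the raw bytes from frame.buffer_data() myself. It is a sketch, not part of the library: `unpack_mono12packed` is my own helper, and it assumes the GigE Vision Mono12Packed byte layout (two pixels packed into three bytes, with the two low nibbles sharing the middle byte).

```python
import numpy as np

def unpack_mono12packed(buf, width, height):
    """Unpack a GigE Vision Mono12Packed buffer into a (height, width) uint16 image.

    Assumed layout per 3-byte group (two pixels p0, p1):
      byte 0: p0 bits 11..4
      byte 1: p1 bits 3..0 in the high nibble, p0 bits 3..0 in the low nibble
      byte 2: p1 bits 11..4
    Verify this against your camera's documentation before relying on it.
    """
    data = np.frombuffer(buf, dtype=np.uint8).astype(np.uint16)
    if data.size != width * height * 3 // 2:
        raise ValueError("buffer size does not match Mono12Packed dimensions")
    triples = data.reshape(-1, 3)
    p0 = (triples[:, 0] << 4) | (triples[:, 1] & 0x0F)   # even pixels
    p1 = (triples[:, 2] << 4) | (triples[:, 1] >> 4)     # odd pixels
    img = np.empty(width * height, dtype=np.uint16)
    img[0::2] = p0
    img[1::2] = p1
    return img.reshape(height, width)
```

With this, `unpack_mono12packed(frame.buffer_data(), 644, 482)` gives the full 12-bit image, which is exactly what I would expect buffer_data_numpy() to return for this pixel format.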
If needed, I can send you examples of the frame data.
Ivan