jdibenes / hl2ss

HoloLens 2 Sensor Streaming. Real-time streaming of HoloLens 2 sensor data over WiFi. Research Mode and External USB-C A/V supported.

Record AHAT AB #100

Closed Zulex closed 5 months ago

Zulex commented 6 months ago

First of all, thanks for this great resource!

I am trying to capture the 512x512 images from the AHAT AB sensor (i.e., the IR reflectivity sensor). In "simple_recorder.py" the AHAT stream is converted to an mp4, in which this data seems to be lost and only the depth image is saved. The AB image from Long Throw has a lower resolution.

When using the same script for Long Throw, this piece of code enables getting the depth and AB images:

```python
# Unpack RM Depth Long Throw to a tar file containing Depth and AB PNGs
if (hl2ss.StreamPort.RM_DEPTH_LONGTHROW in ports):
    hl2ss_utilities.unpack_to_png(filenames[hl2ss.StreamPort.RM_DEPTH_LONGTHROW], os.path.join(path, 'long_throw.tar'))
```

I tried implementing the same for the AHAT images:

```python
# trying AHAT AB
if (hl2ss.StreamPort.RM_DEPTH_AHAT in ports):
    hl2ss_utilities.unpack_to_png(filenames[hl2ss.StreamPort.RM_DEPTH_AHAT], os.path.join(path, 'ahat_throw.tar'))
```

However, I'm only getting completely black images back as results. The PNGs are 512x512 and do change in file size, so some sort of data is being saved. Digging deeper into the code, it seems AHAT doesn't implement a to-PNG conversion. Before implementing one for AHAT myself, I figured there must be an easier way to record the AB images.

As a secondary question: the AHAT AB video stream looks darker than expected, to the point that hands are no longer visible beyond about 50 cm. Is that expected?

Any hints in how to record 512x512 AHAT AB data would be greatly appreciated!

jdibenes commented 6 months ago

Hello, I tried your implementation and it seems to work. The depth images look completely dark because the AHAT depth range is ~1000 mm, so the maximum "brightness" is only about 1000/65535 of the 16-bit range. AHAT AB does look darker than Long Throw AB; taking the sqrt of the image and converting to 8-bit color depth seems to improve visibility.
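A minimal sketch of both rescalings described above, using only NumPy (the ~1000 mm range constant is an assumption taken from the comment above, not a value from the hl2ss API):

```python
import numpy as np

# Approximate AHAT depth range in mm, per the discussion above (an assumption).
AHAT_MAX_DEPTH_MM = 1000.0

def depth_to_visible(depth_u16: np.ndarray) -> np.ndarray:
    # Depth values only reach ~1000 out of 65535, which is why the raw PNGs
    # look black. Rescale by the actual depth range instead of the full 16 bits.
    d = np.clip(depth_u16.astype(np.float32) / AHAT_MAX_DEPTH_MM, 0.0, 1.0)
    return (d * 255.0).astype(np.uint8)

def ab_to_visible(ab_u16: np.ndarray) -> np.ndarray:
    # sqrt compresses the dynamic range so dim IR reflectivity becomes visible,
    # then map to 8-bit for display or PNG export.
    a = np.sqrt(ab_u16.astype(np.float32)) / np.sqrt(65535.0)
    return (a * 255.0).astype(np.uint8)
```

Either function can be applied per frame before saving, e.g. `cv2.imwrite('ab.png', ab_to_visible(frame))`.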

Zulex commented 5 months ago

@jdibenes ,

Thanks for the response!

I now see that many papers do indeed mention that the AB sensor data is brightened for visibility, which luckily means nothing is wrong with my sensors.

I tried brightening the image in Python by raising it to the power of two (see image below). However, as you can see, the image has many strange artefacts. As a sanity check, I brightened the image in Photoshop instead and got the same artefacts.

When using the streaming script, I do not see these artefacts. So any other implementation would be highly appreciated.

[image: ab_177]

Zulex commented 5 months ago

I was overcomplicating this. Using cv2.imwrite in the streaming application did the job.