jdibenes / hl2ss

HoloLens 2 Sensor Streaming. Real-time streaming of HoloLens 2 sensor data over WiFi. Research Mode and External USB-C A/V supported.

Compatibility, performance & other #92

Closed anjelomerte closed 6 months ago

anjelomerte commented 6 months ago

Hi jdibenes! Thank you for making your work open source. It helps the research community very much!

I am currently capturing sensor data and storing it locally on the HL2, and I would love to test out your implementation. Before I do, I have a few questions to clarify whether I understand its capabilities correctly, bearing in mind that I would like to access multiple sensor streams at once, such as front camera footage (+ audio), depth data, and hand/head/eye tracking data:

  1. Does the application still run smoothly, in the sense that a framerate of at least 30 fps is achieved, when streaming all of that sensor data simultaneously? I am trying to figure out which approach suits my use case best: transmitting data immediately, or storing it locally first and then transferring it. If application stability is affected, I can trade off real-time operation. If not, why not make it (close to) real-time and stream the data immediately? :)
  2. I would like to incorporate the plugin into an existing Unity3D 2022 project using MRTK3. Do you foresee any inherent problems there?
  3. Is the Python client indeed supported on Windows as well? In some places the documentation says Linux only, but elsewhere Windows and macOS also seem to be supported. Can you confirm?
  4. Is it possible to manually (programmatically) start/stop the sensor data extraction & streaming when using the plugin for Unity3D?
  5. Regarding eye tracking data, I have seen that origin & direction are extracted. To get 2D gaze coordinates, I would imagine something like projecting the gaze rays onto the depth map (e.g. AHAT). Is that realistic?

Thank you very much for your work again! Looking forward to your feedback! Best regards

jdibenes commented 6 months ago

Hello,

1) PV + AHAT depth + Spatial Input + Audio should run smoothly. Video streams are the most demanding, and in our tests PV + AHAT works OK.
2) I think it should work, but I have not tested the plugin in Unity 2022, so I am not sure. If there are any issues, hopefully they will only require some small changes to hl2ss.cs.
3) Yes, Windows is supported.
4) Streaming is controlled by the client. See the example Python scripts: we use client.open(), client.get_next_packet(), and client.close() to start the stream, acquire frames, and stop streaming. For our project, we run a Unity application (with the plugin) on the HoloLens and then run our Python code on a desktop PC to capture and process the HoloLens data.
5) To get 2D gaze coordinates, we raycast against the Spatial Mapping mesh to get the ray length, as the StreamRecorder sample from HoloLens2ForCV does.
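To illustrate the client-controlled lifecycle described in (4), here is a minimal sketch of the open/acquire/close pattern. `StubClient` and `capture` are hypothetical stand-ins for illustration only; the real hl2ss receiver classes are constructed with a host address and stream port, but expose the same `open()`, `get_next_packet()`, and `close()` calls mentioned above.

```python
class StubClient:
    """Hypothetical stand-in for an hl2ss stream receiver."""

    def __init__(self, frames):
        self._frames = list(frames)
        self._opened = False

    def open(self):
        # A real client connects to the HoloLens stream here.
        self._opened = True

    def get_next_packet(self):
        # A real client blocks until the next sensor packet arrives.
        assert self._opened, 'call open() before acquiring packets'
        return self._frames.pop(0) if self._frames else None

    def close(self):
        # A real client shuts down the connection here.
        self._opened = False


def capture(client, count):
    # Start the stream, acquire frames, stop streaming.
    client.open()
    packets = [client.get_next_packet() for _ in range(count)]
    client.close()
    return packets
```

Because the client drives the stream, starting and stopping programmatically (question 4) amounts to deciding when to call `open()` and `close()`.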
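The projection step in (5) can be sketched as follows: once raycasting yields the ray length, the 3D hit point is the origin plus length times direction, and a pinhole model maps it to pixel coordinates. This is a simplified illustration assuming the ray is already expressed in the camera's coordinate frame (in practice the Spatial Input pose must first be transformed into the camera frame) and assuming illustrative intrinsics; the function name is hypothetical.

```python
def gaze_to_pixel(origin, direction, ray_length, fx, fy, cx, cy):
    """Project a gaze ray onto a camera image plane (sketch).

    origin, direction: gaze ray in the camera frame (assumption).
    ray_length: distance to the hit point, e.g. from raycasting
    against the depth map or the Spatial Mapping mesh.
    fx, fy, cx, cy: pinhole camera intrinsics (illustrative).
    """
    # 3D hit point along the ray
    px = origin[0] + ray_length * direction[0]
    py = origin[1] + ray_length * direction[1]
    pz = origin[2] + ray_length * direction[2]
    # Pinhole projection into pixel coordinates
    u = fx * (px / pz) + cx
    v = fy * (py / pz) + cy
    return u, v
```

For example, a gaze ray pointing straight down the optical axis lands on the principal point (cx, cy) regardless of the ray length.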

anjelomerte commented 6 months ago

Thank you for your input, will try it out then!