ghost opened this issue 5 years ago
I was talking with another developer here, and we think we may be able to do all of our rendering in our Rust code and then send the frames to the C++/C# code that does the HolographicStreamerHelpers work.
The open question, of course, is the FFI. Do we pass the frames locally across the FFI boundary, or transport them over a 10G network NIC? Any architectural advice would be really useful.
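For the local-FFI option, we are imagining something like the sketch below: the Rust renderer hands each finished frame to the C++/C# host through a C-compatible descriptor, and the host then feeds it into the streamer. All the names here (`FrameDesc`, `submit_frame`) are illustrative assumptions, not a real API; it only shows the shape of the boundary.

```rust
/// C-compatible frame descriptor; #[repr(C)] fixes the layout so a C++
/// header can declare the identical struct on the other side.
#[repr(C)]
pub struct FrameDesc {
    pub width: u32,
    pub height: u32,
    pub stride: u32,     // bytes per row
    pub data: *const u8, // pointer into the renderer's frame buffer
    pub len: usize,      // total byte length of the buffer
}

/// Hypothetical exported entry point the C++ host would call (or that Rust
/// would call into the host with a mirrored signature). The pointer is only
/// valid for the duration of the call, so the host must copy or encode
/// before returning. Returns 0 on success, -1 on a bad descriptor.
#[no_mangle]
pub extern "C" fn submit_frame(desc: *const FrameDesc) -> i32 {
    if desc.is_null() {
        return -1;
    }
    let desc = unsafe { &*desc };
    if desc.data.is_null() || desc.len < (desc.stride as usize) * (desc.height as usize) {
        return -1; // reject inconsistent descriptors
    }
    // The real host would hand the frame to the encoder/streamer here;
    // this sketch just acknowledges receipt.
    0
}

fn main() {
    // Exercise the boundary in-process with a tiny 2x2 RGBA frame.
    let pixels = [0u8; 2 * 2 * 4];
    let desc = FrameDesc {
        width: 2,
        height: 2,
        stride: 2 * 4,
        data: pixels.as_ptr(),
        len: pixels.len(),
    };
    assert_eq!(submit_frame(&desc), 0);
    assert_eq!(submit_frame(std::ptr::null()), -1);
    println!("frame accepted");
}
```

The appeal of this route over the network one is that no serialization or codec matching is needed at the boundary at all; the cost is that the Rust renderer and the streamer host have to live in the same process (or share memory).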
We need to do Remoting (https://docs.microsoft.com/en-us/windows/mixed-reality/add-holographic-remoting) to the HoloLens for a medical research project, but it seems the server side is limited to Microsoft Desktop / Windows 10?
Whatever video codecs are used, we can match them, and the same goes for the event stream from the headset. We are using Rust and Vulkan for rendering on a visualization cluster we built ourselves, which lets us render very large MRI / fMRI images and overlay any other DICOM medical imaging data.
Please let us know, because we need to decide on the best MR headset for the project soon.