microsoft / MixedReality-WebRTC

MixedReality-WebRTC is a collection of components that help mixed reality app developers integrate real-time audio and video communication into their applications and improve their collaborative experience.
https://microsoft.github.io/MixedReality-WebRTC/
MIT License

Forward video stream from shared memory via MR WebRTC #775

Closed SilverLive closed 2 years ago

SilverLive commented 3 years ago

Other video streaming libraries (live555, FFmpeg, etc.) are able to write incoming streams to shared memory. I'd like to grab this data and forward it via WebRTC. The reason I need this is that all of those libraries are really problematic to get working with Unity on a HoloLens device, yet I need to receive their streams on such a device anyway. Is this possible with MR WebRTC?

Thanks in advance! Best regards, Johannes

SilverLive commented 2 years ago

To answer my own question: yes, it is possible, but not with existing WebRTC implementations out of the box. With the help of MemoryMappedFiles (see the Microsoft documentation) or any other shared-memory approach, you can load single frames into shared memory. The only thing you have to do on the MR WebRTC side is write a new custom video source:

1. In its UpdateBuffer() method, read the byte array, pixel height, pixel width, array count, and all other important information about the current frame from shared memory.
2. In OnFrameRequested(), calculate the pixel-format-dependent values; for example, I420A needs specific strides, chroma size, luma size, and so on.
3. Finish OnFrameRequested() with request.CompleteRequest(frame), where frame carries those parameters as an I420AVideoFrame or an Argb32VideoFrame (other formats are not implemented so far).

To keep this message short, just check out the existing video sources in MR WebRTC and you should be fine; a rough sketch of the idea follows below.
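For anyone landing here later, this is roughly what that can look like against the core C# library, using ExternalVideoTrackSource.CreateFromI420ACallback() instead of a Unity CustomVideoSource subclass. The shared-memory layout (an int32 width/height header followed by the four I420A planes), the map name, and the class name are assumptions for illustration; producer/consumer synchronization (e.g. a mutex or sequence number) is also needed in practice and is omitted here.

```csharp
using System;
using System.IO.MemoryMappedFiles;
using System.Runtime.InteropServices;
using Microsoft.MixedReality.WebRTC;

// Reads I420A frames that another process wrote into a named memory-mapped
// file and feeds them to an ExternalVideoTrackSource. The layout below
// ([int32 width][int32 height][Y][U][V][A]) is an assumed convention for
// this sketch, not something MR WebRTC prescribes. Note that named maps
// require Windows; OpenExisting throws on unsupported platforms.
public class SharedMemoryVideoSource : IDisposable
{
    private readonly MemoryMappedFile _file;
    private readonly MemoryMappedViewAccessor _view;
    private byte[] _buffer = Array.Empty<byte>();

    public ExternalVideoTrackSource Source { get; }

    public SharedMemoryVideoSource(string mapName)
    {
        _file = MemoryMappedFile.OpenExisting(mapName);
        _view = _file.CreateViewAccessor();
        // MR WebRTC invokes this callback each time the track needs a frame.
        Source = ExternalVideoTrackSource.CreateFromI420ACallback(OnFrameRequested);
    }

    private void OnFrameRequested(in FrameRequest request)
    {
        // Assumed header convention written by the producer process.
        int width = _view.ReadInt32(0);
        int height = _view.ReadInt32(4);

        // I420A plane sizes: full-resolution luma (Y) and alpha (A),
        // quarter-resolution chroma (U, V).
        int lumaSize = width * height;
        int chromaSize = (width / 2) * (height / 2);
        int frameSize = 2 * lumaSize + 2 * chromaSize; // Y + U + V + A
        if (_buffer.Length < frameSize)
        {
            _buffer = new byte[frameSize];
        }
        _view.ReadArray(8, _buffer, 0, frameSize);

        // Pin the managed buffer so raw pointers can be handed to WebRTC;
        // CompleteRequest() copies the frame data before returning.
        var pin = GCHandle.Alloc(_buffer, GCHandleType.Pinned);
        try
        {
            IntPtr dataY = pin.AddrOfPinnedObject();
            var frame = new I420AVideoFrame
            {
                width = (uint)width,
                height = (uint)height,
                dataY = dataY,
                dataU = dataY + lumaSize,
                dataV = dataY + lumaSize + chromaSize,
                dataA = dataY + lumaSize + 2 * chromaSize,
                strideY = width,
                strideU = width / 2,
                strideV = width / 2,
                strideA = width,
            };
            request.CompleteRequest(frame);
        }
        finally
        {
            pin.Free();
        }
    }

    public void Dispose()
    {
        Source.Dispose();
        _view.Dispose();
        _file.Dispose();
    }
}
```

The resulting Source can then be attached to a local video track (LocalVideoTrack.CreateFromSource()) the same way the built-in sources are used.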