Closed: SilverLive closed this issue 3 years ago
To answer my own question: yes, it is possible, but not with the existing WebRTC implementations alone. With the help of MemoryMappedFiles (see the Microsoft documentation) or any other shared-memory approach, you can load single frames into shared memory. The only thing you have to do on the MR WebRTC side is write a new custom video source. In its UpdateBuffer() method, add code that reads the byte array, pixelHeight, pixelWidth, the array count, and all other relevant information about the current frame from shared memory. Then, in the OnFrameRequest() method, calculate the pixel-format-dependent values (for example, I420A needs specific strides, chroma size, luma size, and so on). Finally, complete OnFrameRequest() with request.CompleteRequest(frame), where frame contains the specified parameters of an I420AVideoFrame or RGBAVideoFrame; other formats are not implemented so far. To keep this message short: just check out the existing video sources in mr-webrtc, and you should be fine.
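To make the "pixel-format-dependent values" concrete, here is a minimal sketch of the I420A bookkeeping mentioned above. The helper name `I420ALayout` is hypothetical, not part of MR-WebRTC; it just computes the plane sizes and strides for a frame with even width and height:

```csharp
using System;

// Hypothetical helper illustrating the I420A layout math described above.
// An I420A frame of size w x h consists of four planes:
//   - Y (luma) and A (alpha): w x h bytes each, stride = w
//   - U and V (chroma, subsampled 2x2): (w/2) x (h/2) bytes each, stride = w/2
static (int lumaSize, int chromaSize, int strideY, int strideUV, int totalSize)
    I420ALayout(int width, int height)
{
    int lumaSize = width * height;               // size of the Y plane (and the A plane)
    int chromaSize = (width / 2) * (height / 2); // size of the U plane (and the V plane)
    int totalSize = 2 * lumaSize + 2 * chromaSize; // Y + U + V + A
    return (lumaSize, chromaSize, width, width / 2, totalSize);
}

var layout = I420ALayout(640, 480);
Console.WriteLine($"luma={layout.lumaSize} chroma={layout.chromaSize} total={layout.totalSize}");
// For 640x480: luma=307200, chroma=76800, total=768000
```

These offsets tell you where each plane starts inside the byte array read from shared memory; the resulting plane pointers and strides are what goes into the frame struct you hand to request.CompleteRequest(frame).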
Any of the common video streaming libraries (live555, ffmpeg, etc.) is able to write incoming streams to shared memory. I'd like to grab this data and forward it via WebRTC. The reason I need this is that all of those libraries are really problematic to get working with Unity on a HoloLens device, but I need to receive their streams on such a device anyway. Is this possible with MR WebRTC?
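The hand-off described here can be sketched with System.IO.MemoryMappedFiles. The header layout (three int32s: width, height, payload length) is an assumption for illustration; a real exchange also needs a frame counter or a synchronization primitive so the reader never sees a half-written frame, and with an external producer such as ffmpeg you would open a named map instead of a file-backed one:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

// Sketch: a producer writes one frame plus a small header into shared
// memory, and the consumer (the custom video source) reads it back.
string path = Path.Combine(Path.GetTempPath(), "frame_demo.bin");
const int HeaderBytes = 12;               // 3 x int32: width, height, length
int width = 4, height = 2;
byte[] pixels = new byte[width * height]; // dummy grayscale payload
for (int i = 0; i < pixels.Length; i++) pixels[i] = (byte)i;

using (var mmf = MemoryMappedFile.CreateFromFile(
           path, FileMode.Create, null, HeaderBytes + pixels.Length))
{
    // Producer side: write the header first, then the pixel data.
    using (var writer = mmf.CreateViewAccessor())
    {
        writer.Write(0, width);
        writer.Write(4, height);
        writer.Write(8, pixels.Length);
        writer.WriteArray(HeaderBytes, pixels, 0, pixels.Length);
    }

    // Consumer side: read the header, then copy the frame out.
    using (var reader = mmf.CreateViewAccessor())
    {
        int w = reader.ReadInt32(0);
        int h = reader.ReadInt32(4);
        int len = reader.ReadInt32(8);
        byte[] frame = new byte[len];
        reader.ReadArray(HeaderBytes, frame, 0, len);
        Console.WriteLine($"{w}x{h}, {len} bytes");
    }
}
File.Delete(path);
```

A file-backed map is used here so the sketch runs cross-platform; on Windows/UWP, where both processes can share a named map, the consumer would call MemoryMappedFile.OpenExisting with the name agreed with the producer.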
Thanks in advance! Best regards, Johannes