danilogr opened this issue 4 years ago
Follow-up: Would it be possible to receive the RTP packets in a callback? Perhaps I could bring in FFMPEG externally just to decode the H.264 stream?
The existing H.264 support via Media Foundation was written as UWP-specific code, but as I understand it the API itself is not UWP-only, so the best way I see would be to make that code work on Desktop too. Unfortunately, we don't really have the resources to do that right now. However, this is a good suggestion and certainly something we want to do eventually, if only because the discrepancy has already caused issues for many users, me included.
One other piece of information: @LoadLibrary, who is managing the WebRTC UWP project, is looking at porting the existing UWP code to Google's repository. As I understand it, his changes might make Desktop support available too. So hopefully we will be able to take that work and integrate it into MixedReality-WebRTC.
I don't feel the raw RTP packet route is the best way, as it would not really integrate into the design of libwebrtc. In any case, I have never looked at this, and a priori I doubt there are callbacks available for us to extract this raw data. If you really want to go down the FFMPEG route (and please take care of the licensing issues for OpenH264 if you do), then I think implementing an encoder/decoder module for libwebrtc is a much better way than trying to surface the raw RTP packets and manage them outside of the libwebrtc framework. You can have a look at how the existing Media Foundation H.264 encoder/decoder is integrated. To be clear, though, we most likely wouldn't take this change in because of licensing issues with FFMPEG and OpenH264, which as I understand it is the same reason Google no longer compiles with OpenH264 by default (you have to explicitly set a flag during the libwebrtc build to show consent and an understanding of the licensing).
Hi, has there been any update on this? I'm also looking to get H.264 working via WebRTC in a WPF app.
For our research project, we ended up choosing an "easier" route:
The WebRTC-Broker connects through the MixedReality-WebRTC module to the HoloLens, receives H264 content, decodes it using Media Foundation, and then streams the decoded video to the Windows Desktop App through a TCP connection. Both applications are running on the same computer, so there is little to no latency there.
To connect the WebRTC-Broker with the Windows Desktop App, the WebRTC broker binds a TCP server to localhost. This TCP server waits for connections and forwards decoded frames to any clients connected to it. While this solution seemed trivial at first, it didn't work because of UWP's Network Isolation feature. That is, UWP forces apps running on the same computer to communicate through a different mechanism, and blocks local TCP/UDP loopback servers.
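The forwarding step described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the actual broker code: it assumes a simple length-prefixed framing protocol over loopback TCP (the real broker is a Windows app decoding with Media Foundation; the function names here are made up for the example):

```python
import socket
import struct
import threading


def serve_frames(server_sock, frames):
    """Accept one client and forward each decoded frame, length-prefixed."""
    conn, _ = server_sock.accept()
    with conn:
        for frame in frames:
            # 4-byte big-endian length header, then the raw frame bytes
            conn.sendall(struct.pack(">I", len(frame)) + frame)


def recv_exact(sock, n):
    """Read exactly n bytes from the socket, or raise if the peer closes."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed")
        buf += chunk
    return buf


def recv_frame(sock):
    """Read one length-prefixed frame sent by the broker."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)
```

The length prefix matters because TCP is a byte stream with no message boundaries; without it, the receiving app cannot tell where one decoded frame ends and the next begins.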
After some digging, we figured out that we can work around the network isolation feature. To do so, we have to open an admin prompt and run the following command:
CheckNetIsolation.exe LoopbackExempt -a -n=<AppContainer or Package Family>
(See this link for more information on this command)
After that, we had our high-res, low-latency HoloLens2 FPV showing up quite nicely! I do not recommend this solution for an enterprise application, but for a research project it works pretty well ;)
Also, I have to give kudos to @tlsharkey as he is responsible for figuring this out and implementing this solution on our side!
Adding to @danilogr 's comment above:
If you want an enterprise solution, the 'right' way of implementing app-to-app communication on Windows is now through App Services. I believe they work with both UWP and regular desktop apps.
Additionally, UWP's Network Isolation only applies locally. You can stream the decoded video to another computer without creating a loopback exemption for your app; the exemption is only needed when forwarding the data to another app on the same computer.
Thanks for your replies! I'll have a go at implementing a UWP broker app in-between, then :)
Hello everyone! First and foremost, thank you so much for all the effort that you put into this library. I've been streaming video to/from HoloLens since 2016, and back then I had to do a bunch of dirty work with ffmpeg to get everything running. This library is a huge step up for everyone working on Communication&Collaboration with HoloLens.
H264 decoding is currently not supported on Windows Desktop, but it would be very useful for those deploying desktop apps (e.g., VR) instead of UWP apps.
Could you please shed some light on the possible ways of supporting H264 decoding on Windows Desktop?
Thanks!