Open timo92700 opened 4 years ago
See #39 - You need to figure out how Janus is doing its signaling, and write some implementation for it for MixedReality-WebRTC. Then you can use MixedReality-WebRTC, for example the C# library, to write a UWP app that can connect to Janus. This is an interoperability issue, and we do not offer support for Janus (nor any specific media server solution) at this time.
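To give an idea of what "figuring out Janus's signaling" involves: Janus exposes a JSON-based API (over HTTP or WebSocket) where you create a session, attach a plugin handle, and then exchange SDP via `jsep` messages. Below is a minimal, hedged sketch of the JSON payloads for those first steps; the endpoint URL, the videoroom plugin choice, and the `configure` request body are assumptions for illustration, not a tested integration:

```python
import json
import uuid

JANUS_URL = "http://localhost:8088/janus"  # assumed default Janus REST endpoint


def make_transaction():
    # Janus requires a unique transaction id on every request so it can
    # correlate asynchronous responses with requests.
    return uuid.uuid4().hex


def create_session_msg():
    # Step 1: POST this to JANUS_URL; the reply carries the session id.
    return {"janus": "create", "transaction": make_transaction()}


def attach_plugin_msg(plugin="janus.plugin.videoroom"):
    # Step 2: POST this to JANUS_URL/<session_id> to get a plugin handle.
    # The videoroom plugin is an assumption; pick whatever plugin you target.
    return {"janus": "attach", "plugin": plugin, "transaction": make_transaction()}


def offer_msg(sdp):
    # Step 3: send the local SDP offer (e.g. produced by MixedReality-WebRTC's
    # PeerConnection) to the plugin handle; the "configure" body is illustrative.
    return {
        "janus": "message",
        "body": {"request": "configure"},
        "jsep": {"type": "offer", "sdp": sdp},
        "transaction": make_transaction(),
    }


if __name__ == "__main__":
    print(json.dumps(create_session_msg()))
```

On the MixedReality-WebRTC side, you would feed the SDP answer and ICE candidates from Janus's replies back into the peer connection; that wiring is the interoperability work referred to above.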
Thanks for your reply, I'm going to check this :)
Hi @timo92700, did you find a solution? I'm trying to build a similar system. The idea would be to stream the Mixed Reality Capture from the HoloLens to a Janus server so I can record it and relay the stream to an external Unity app.
To do it this way, we used the HoloLens streaming view from the Device Portal API: http://192.168.0.XXXXX/api/holographic/stream/live_high.mp4?holo=true&pv=true&mic=false&loopback=false&RenderFromCamera=false
You can open this URL in VLC media player, then use VLC's built-in tools to stream (emit) this video feed, and connect from other VLC instances on the network (or the Unity video player) using the streamer's IP address and the specified port. You will then be able to receive the HoloLens's video feed on other devices on the network (with some additional delay). I stopped working with the Janus server because it's only a peer-to-peer system.
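For reference, the VLC GUI steps above can also be scripted from the command line. A rough sketch, where the output port 8080 is an assumption and Device Portal authentication may be required on your setup:

```shell
# Pull the HoloLens Device Portal MRC stream and re-serve it over HTTP
# as an MPEG-TS stream that other VLC instances (or Unity) can open.
cvlc "http://192.168.0.XXXXX/api/holographic/stream/live_high.mp4?holo=true&pv=true&mic=false&loopback=false&RenderFromCamera=false" \
  --sout '#standard{access=http,mux=ts,dst=:8080/}'

# On another device, open the relayed stream:
# vlc http://<streamer-ip>:8080/
```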
Thanks for your response. That's actually a good idea. So you couldn't connect multiple peers with Janus?
Yeah, looks like it, unfortunately. By the way, I'm currently waiting for the UWP bindings of the VLC libraries to display a video stream in a built UWP app on HoloLens. Other approaches are too convoluted or not really what I'm looking for. To make it work, I currently send the video feed frame by frame and apply the frames to a Unity texture 30 times per second; it works pretty well but costs a lot of resources.
What library do you use to send your frames?
Hm, I take the input video and use a Python script to cut it into frames, then send the frames directly to the application over TCP, WebSocket, or UDP.
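The frame-by-frame TCP approach described above needs some framing so the receiver knows where each frame ends. A minimal sketch using a 4-byte length prefix per frame; the host, port, and placeholder frame bytes are illustrative, and the loopback `demo()` just shows the round trip in one process:

```python
import socket
import struct
import threading


def send_frame(sock, frame_bytes):
    # Prefix each frame with its 4-byte big-endian length so the receiver
    # can split the TCP byte stream back into whole frames.
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)


def recv_exact(sock, n):
    # Keep reading until exactly n bytes arrive (TCP may deliver partial reads).
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf


def recv_frame(sock):
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)


def demo():
    # Loopback demonstration: send two fake frames and read them back.
    server = socket.socket()
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]
    received = []

    def serve():
        conn, _ = server.accept()
        with conn:
            for _ in range(2):
                received.append(recv_frame(conn))

    t = threading.Thread(target=serve)
    t.start()
    with socket.create_connection(("127.0.0.1", port)) as client:
        send_frame(client, b"frame-1")
        send_frame(client, b"frame-2")
    t.join()
    server.close()
    return received


if __name__ == "__main__":
    print(demo())
```

On the Unity side, the receiver would decode each payload (e.g. JPEG bytes into a `Texture2D`) about 30 times per second; here the payloads are just placeholder bytes.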
Hi, I have set up an SFU server (Janus) and I can stream my screen/webcam as video input on the server side. I can connect a browser to my server (using 1 or n browsers on different machines) and receive the video + audio. Has anyone developed a UWP client able to receive this content (like a browser inside a UWP app)? Is there any solution, or an idea of how to do this? Maybe I have to reverse-engineer the WebRTC protocol and create a texture from the received data to build the player? Or maybe I have to wait for the VLC media player library to become available on UWP, so I can just connect to the HTTP URL?