josephnhtam / live-streaming-server-net

A .NET implementation of RTMP live streaming server, supporting HTTP-FLV, WebSocket-FLV, HLS, Kubernetes, cloud storage services integration and more.
https://josephnhtam.github.io/live-streaming-server-net/

question: stream camera in browser to rtmp server using javascript #40

Closed: amirhosseini01 closed this issue 2 weeks ago

amirhosseini01 commented 2 weeks ago

Hi, thank you for your great project.

I know it's impossible to stream a camera directly from the browser to an RTMP server, and it should be done with WebRTC or other methods. Something like this: browser -> WebRTC -> media server -> FFmpeg -> RTMP

Could you please share any implementation examples if you know of any?

I have already run the HLS and admin panel sample projects. I successfully streamed video to RTMP with an FFmpeg command and displayed the stream in the browser in an HTML video tag with HLS.js, as you mentioned in one of the issues.
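For reference, the playback side I have working looks roughly like the sketch below; the video element id and the manifest path are just placeholders for my setup:

```js
// Minimal HLS.js playback sketch (browser side).
// Assumes a <video id="player"> element and an HLS manifest served at /hls/demo/output.m3u8
// (both names are placeholders; adjust them to your HLS output configuration).
const video = document.getElementById("player");
const manifestUrl = "/hls/demo/output.m3u8";

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(manifestUrl);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari can play HLS natively without HLS.js.
  video.src = manifestUrl;
  video.addEventListener("loadedmetadata", () => video.play());
}
```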

But I'm lost! I don't know how to stream the camera to the RTMP server.

One more thing that I'm not sure about: As a second option, can we use Blazor WebAssembly/Server to run an FFmpeg command inside the browser?

Thank you very much.

Edit: Is it possible to send every single frame to the SignalR server (as a subject), assemble those blobs on the server, and then forward them to the RTMP server?
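What I imagine on the browser side is roughly the sketch below. The hub URL and the SendChunk method name are only placeholders, the MessagePack protocol is assumed so the byte chunks survive the hub call, and the server would still have to pipe the received WebM chunks into FFmpeg to reach RTMP:

```js
// Sketch of the MediaRecorder + SignalR idea (browser side only, all names hypothetical).
import * as signalR from "@microsoft/signalr";
import { MessagePackHubProtocol } from "@microsoft/signalr-protocol-msgpack";

async function startCameraIngest() {
  // MessagePack carries raw binary; the server would need AddMessagePackProtocol() as well.
  const connection = new signalR.HubConnectionBuilder()
    .withUrl("/ingestHub")
    .withHubProtocol(new MessagePackHubProtocol())
    .build();
  await connection.start();

  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp8,opus" });

  recorder.ondataavailable = async (event) => {
    if (event.data.size > 0) {
      // Send each recorded chunk to the hub as a byte array.
      const buffer = await event.data.arrayBuffer();
      await connection.invoke("SendChunk", new Uint8Array(buffer));
    }
  };

  // Emit a chunk every second; the server must treat the chunks as one continuous WebM stream.
  recorder.start(1000);
}
```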

josephnhtam commented 2 weeks ago

Hi @amirhosseini01

Typically, the most efficient way to stream a video from a browser is to use WebRTC. However, this library currently doesn’t support WebRTC ingestion. I would like to add WebRTC ingestion as well, but I can’t provide an ETA at the moment.

Nevertheless, having a relay server to republish the stream to an RTMP server is feasible.
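As a rough illustration of that relay idea (not something this library provides out of the box, and every port, path, and URL below is just an assumption), a small Node.js process could accept MediaRecorder chunks over a WebSocket and pipe them into an FFmpeg process that publishes to the RTMP endpoint:

```js
// Node.js relay sketch: accept MediaRecorder WebM chunks over a WebSocket and
// republish them to the RTMP server by piping them into FFmpeg's stdin.
// Requires the "ws" package and an ffmpeg binary on PATH.
const { WebSocketServer } = require("ws");
const { spawn } = require("child_process");

const wss = new WebSocketServer({ port: 8080, path: "/ingest" });

wss.on("connection", (socket) => {
  // Re-encode the incoming WebM stream and publish it as FLV over RTMP.
  const ffmpeg = spawn("ffmpeg", [
    "-i", "pipe:0",              // read the WebM chunks from stdin
    "-c:v", "libx264",           // H.264 video for RTMP/FLV compatibility
    "-preset", "veryfast",
    "-c:a", "aac",               // AAC audio
    "-f", "flv",
    "rtmp://localhost:1935/live/demo", // placeholder RTMP endpoint and stream key
  ]);

  socket.on("message", (chunk) => ffmpeg.stdin.write(chunk));
  socket.on("close", () => ffmpeg.stdin.end());
});
```

WebRTC ingestion would still be the lower-latency option once available; this kind of relay trades latency for simplicity.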