ffmpegwasm / ffmpeg.wasm

FFmpeg for browser, powered by WebAssembly
https://ffmpegwasm.netlify.app
MIT License

Is RTMP live transcoding available? #100

Open Gexa opened 3 years ago

Gexa commented 3 years ago

I would like to ask if it is possible to transcode, for example, a live webcam feed to an RTMP stream, as with the CLI version of ffmpeg. That is, could the incoming data from the camera be transcoded by ffmpeg.wasm directly to an RTMP server?

My main question is: is it possible to do something like this (transcode_to_RTMP_server being a hypothetical API)? mediaRecorder.ondataavailable = async (event) => { ffmpeg.transcode_to_RTMP_server(new Uint8Array(await event.data.arrayBuffer())) }
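
For context, transcoding recorded chunks locally already works; the RTMP output is the missing piece. A rough sketch of the working part, assuming the 0.x createFFmpeg API and the mediaRecorder instance from the question:

```js
// Rough sketch, assuming the 0.x @ffmpeg/ffmpeg API (createFFmpeg) and an
// existing mediaRecorder instance. Chunks are collected, written to
// ffmpeg.wasm's in-memory filesystem and transcoded to a local file; an RTMP
// URL in place of 'output.mp4' is exactly the part that does not work today.
import { createFFmpeg } from '@ffmpeg/ffmpeg';

const ffmpeg = createFFmpeg({ log: true });
const chunks = [];

mediaRecorder.ondataavailable = (event) => chunks.push(event.data);

mediaRecorder.onstop = async () => {
  if (!ffmpeg.isLoaded()) await ffmpeg.load();

  // Write the recorded WebM into the virtual filesystem.
  const webm = new Uint8Array(await new Blob(chunks).arrayBuffer());
  ffmpeg.FS('writeFile', 'input.webm', webm);

  // Transcode locally; the result stays in MEMFS and can be read back.
  await ffmpeg.run('-i', 'input.webm', '-c:v', 'libx264', 'output.mp4');
  const mp4 = ffmpeg.FS('readFile', 'output.mp4');
  console.log('transcoded bytes:', mp4.length);
};
```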

Thank you very much!

jeromewu commented 3 years ago

Currently the answer is no. I am still researching how to integrate an RTMP/RTSP library with ffmpeg.wasm and will announce once I have found a way. 😄

agaripian commented 3 years ago

@jeromewu thank you for working on such a great project. If you find a way to do RTMP in the browser, that would be a huge breakthrough for live streaming.

rckprtr commented 3 years ago

This could be extremely impactful to the live streaming community for developing browser-based tools. You could stream directly to Twitch/YouTube/Facebook straight from the browser without downloading OBS/Streamlabs.

meetwudi commented 3 years ago

I did a bit of digging and found that Emscripten internally uses WebSockets to emulate POSIX TCP sockets. This breaks RTMP for obvious reasons: RTMP servers don't understand WebSocket right out of the box.

There is a way to do this with a proxy server, but that requires a separate process to be running somewhere. We might be able to run our own proxy server (which transforms WebSocket -> TCP) and stream to it. This wouldn't make it entirely "server-less", but it would still take the heavy lifting of transcoding away from servers.
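
A minimal sketch of such a relay in Node (purely illustrative; the host and ports are placeholders, and in practice an existing tool like websockify does the same job):

```js
// Minimal sketch of a WebSocket -> TCP relay. Assumes the "ws" npm package;
// RTMP_HOST and RTMP_PORT are placeholders for your ingest server.
const net = require('net');
const { WebSocketServer } = require('ws');

const RTMP_HOST = 'live.example.com'; // placeholder ingest host
const RTMP_PORT = 1935;

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws) => {
  // Open a raw TCP connection to the RTMP server for each browser client.
  const tcp = net.connect(RTMP_PORT, RTMP_HOST);

  // Browser (Emscripten-emulated socket) -> RTMP server
  ws.on('message', (chunk) => tcp.write(chunk));

  // RTMP server -> browser
  tcp.on('data', (chunk) => ws.send(chunk));

  // Tear down both sides together.
  ws.on('close', () => tcp.end());
  tcp.on('close', () => ws.close());
  tcp.on('error', () => ws.close());
});
```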

pratik9722 commented 3 years ago

@tjwudi Is it not possible to take webcam media or a screen share captured via the getUserMedia() WebRTC API into ffmpeg.wasm and then stream directly to YouTube or any RTMP server?

ciaoamigoschat commented 3 years ago

@tjwudi Is it not possible to take webcam media or a screen share captured via the getUserMedia() WebRTC API into ffmpeg.wasm and then stream directly to YouTube or any RTMP server?

Any news?

pratik9722 commented 3 years ago

The MediaRecorder API is also there, but how can we stream those recordings to YouTube or any RTMP server?

HamptonMakes commented 3 years ago

It's unlikely this will ever be possible, as WASM isn't allowed to make any requests that don't go through normal browser security rules and protocols, and browsers just don't support the RTMP network protocol at all.

oyed commented 3 years ago

This might be a job for a separate sister framework, but it'd be cool to implement a similar system within something like Electron. Since you can securely stream WebRTC from the Renderer process (Web) to a Preload/Main process (Node), I'd guess there is an existing library to create an RTMP stream over Node.js (if not, possibly a native binary could fit in there?)

Just speculation, but would be awesome to see.

EDIT: Obviously this wouldn't be the "all-in-browser" approach people want, but the web is still the web; it takes a lot of time and extreme patience for anything like an entirely new protocol to be added.
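
As a very rough sketch of that Electron idea, here is what the main-process side could look like if the renderer simply forwards MediaRecorder chunks over IPC (the channel names are hypothetical, and this assumes a native ffmpeg binary on the PATH rather than ffmpeg.wasm):

```js
// Rough sketch of the Electron idea (main process side). Assumes the renderer
// records with MediaRecorder and forwards each chunk over IPC on hypothetical
// "stream-start" / "stream-chunk" / "stream-stop" channels, and that a native
// ffmpeg binary is available on the PATH.
const { ipcMain } = require('electron');
const { spawn } = require('child_process');

let ffmpegProc = null;

ipcMain.on('stream-start', (_event, rtmpUrl) => {
  // Read WebM chunks from stdin and push them to the RTMP ingest as FLV.
  ffmpegProc = spawn('ffmpeg', [
    '-i', 'pipe:0',
    '-c:v', 'libx264', '-preset', 'veryfast',
    '-c:a', 'aac',
    '-f', 'flv', rtmpUrl,
  ]);
});

ipcMain.on('stream-chunk', (_event, chunk) => {
  if (ffmpegProc) ffmpegProc.stdin.write(Buffer.from(chunk));
});

ipcMain.on('stream-stop', () => {
  if (ffmpegProc) ffmpegProc.stdin.end();
  ffmpegProc = null;
});
```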

lr-mjaouen commented 3 years ago

Maybe with Electron. As @hcatlin said, I don't think WASM will open any new possibility to support RTMP from a browser.

alew3 commented 3 years ago

This would be very cool, as currently there isn't any simple way to go live from the web to an RTMP streaming server without an intermediary service.

rafael2k commented 2 years ago

How about live webcam streaming to an Icecast2 server? ffmpeg does indeed support Icecast2 output. And of course, from an Icecast2 server, one can relay the stream to any other destination.
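
For reference, native ffmpeg exposes Icecast output via the icecast:// protocol. A purely speculative sketch of what that invocation would look like through ffmpeg.wasm's 0.x run() wrapper follows; the WebSocket-vs-TCP limitation discussed above still applies, so this would need the same kind of proxy to actually reach a server, and the URL, credentials, and codec are placeholders that depend on what the core build includes:

```js
// Speculative sketch only: the icecast:// protocol exists in native ffmpeg,
// but ffmpeg.wasm's network layer (WebSocket-emulated sockets) means this
// will not reach a real Icecast server from the browser without a proxy.
// URL, credentials and codec are placeholders.
import { createFFmpeg } from '@ffmpeg/ffmpeg';

const ffmpeg = createFFmpeg({ log: true });

async function streamToIcecast() {
  await ffmpeg.load();
  // 'input.webm' would be a recording already written to MEMFS.
  await ffmpeg.run(
    '-re', '-i', 'input.webm',
    '-vn', '-c:a', 'libvorbis',      // assumes the core build includes libvorbis
    '-content_type', 'audio/ogg',
    '-f', 'ogg',
    'icecast://source:hackme@icecast.example.com:8000/live.ogg'
  );
}
```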

davedoesdev commented 2 years ago

Live streaming via HLS here: https://github.com/Kagami/ffmpeg.js/pull/166
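
Not the mechanism of that PR specifically, but as a rough illustration of why HLS suits the browser better than RTMP: the output is just playlist and segment files, which can be uploaded over plain HTTPS with fetch. A sketch under those assumptions (the ingest URL is a placeholder, and a real live pipeline would upload segments as they are written rather than after the run finishes):

```js
// Rough illustration only (not how the linked PR does it): produce HLS
// segments with ffmpeg.wasm into its in-memory filesystem, then upload them
// over plain HTTPS.
import { createFFmpeg } from '@ffmpeg/ffmpeg';

const ffmpeg = createFFmpeg({ log: true });

async function hlsFromRecording(ingestBaseUrl) { // ingestBaseUrl is a placeholder
  await ffmpeg.load();
  // 'input.webm' is assumed to be a recording already written to MEMFS.
  await ffmpeg.run(
    '-i', 'input.webm',
    '-c:v', 'libx264', '-c:a', 'aac',
    '-f', 'hls', '-hls_time', '2', '-hls_list_size', '3',
    'live.m3u8'
  );

  // Upload the playlist and segments via fetch (HTTPS), which browsers allow.
  const files = ffmpeg.FS('readdir', '/')
    .filter((f) => f.endsWith('.ts') || f.endsWith('.m3u8'));
  for (const name of files) {
    await fetch(`${ingestBaseUrl}/${name}`, {
      method: 'PUT',
      body: ffmpeg.FS('readFile', name),
    });
  }
}
```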

rafael2k commented 2 years ago

Great! Could you share a sample HLS streaming html / js pipeline? ; )

rafael2k commented 2 years ago

One question: DASH is also enabled, right?

davedoesdev commented 2 years ago

@rafael2k yes I've got a sample app but it's not quite ready yet. Re DASH - as you can see from the commits, I did start with DASH but I didn't have an ingestion URL to test. It'd be possible to put it back in if I could test it.

rafael2k commented 2 years ago

Hi @davedoesdev! I'm really willing to test DASH and can help. I'm still in my infancy with WebAssembly, but I am a long-time C developer. I did not even manage to compile ffmpeg.wasm on my Debian Bullseye (11) with Emscripten installed via apt-get; it seems the ad-hoc build script is made for an externally installed toolchain rather than a natively installed one. But I can already write Makefiles for my projects and compile them to .wasm (so for now, if you can provide binaries, I can start playing sooner). Also, please point me to which repo to (try to) build from. Is there a plan to merge ffmpeg.wasm back into the main ffmpeg repo?

davedoesdev commented 2 years ago

@rafael2k let's take it to https://github.com/davedoesdev/ffmpeg.js/issues/1

I compile using emsdk: https://emscripten.org/docs/getting_started/downloads.html

davedoesdev commented 2 years ago

Demo here: https://rawgit-now.netlify.app/davedoesdev/streamana/publish/site/streamana.html
Source: https://github.com/davedoesdev/streamana

The demo streams to YouTube Live using the camera and microphone, captured with getUserMedia (all in the browser).
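
For anyone wiring up something similar, the capture side is plain web APIs. A minimal sketch of getUserMedia plus MediaRecorder producing timed chunks, which you would then hand to ffmpeg.wasm, an HLS uploader, or a proxy:

```js
// Minimal capture sketch: grab camera + microphone and emit a WebM chunk
// roughly every second. What you do with the chunks (ffmpeg.wasm, HLS upload,
// WebSocket proxy, Electron IPC) is the part discussed in this thread.
async function startCapture(onChunk) {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const recorder = new MediaRecorder(stream, {
    mimeType: 'video/webm;codecs=vp8,opus', // widely supported in Chromium/Firefox
  });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) onChunk(event.data); // event.data is a Blob
  };

  recorder.start(1000); // timeslice: fire ondataavailable about every 1000 ms
  return recorder;
}

// Usage: startCapture((blob) => console.log('chunk', blob.size));
```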

MuneebChaudhry-dev commented 1 month ago

Currently the answer is no. I am still researching how to integrate an RTMP/RTSP library with ffmpeg.wasm and will announce once I have found a way. 😄

@jeromewu Is this still pending?