ScarletsFiction / SFMediaStream

HTML5 media streamer library for playing music, video, playlist, or even live streaming microphone & camera with node server

Can I have multiple transmitters and listeners at the same time? #9

Open gcandamo opened 4 years ago

gcandamo commented 4 years ago

Hello, I have been following your project for a long time and find it very interesting. I was using it to transmit live audio, but only one-way (one transmitter and multiple listeners). Now I want to go to the next level and have multiple transmitters and listeners at the same time.

I want each user to be an audio transmitter and receiver at the same time, so they can also hear what other users are broadcasting, something like Discord.

Is it possible to do this with your project?

And if so, how could I do it?

Thank you for sharing this project with us; it's really very useful.

StefansArya commented 4 years ago

Of course. It's true that the example covers one presenter with multiple streamers by saving the bufferHeader on the Node server, and every new user who wants to stream will request the bufferHeader first. But you're not limited to that; you can extend it to multiple presenters and multiple streamers by developing your own media router on the server.

For example, you could keep the bufferHeader in the presenter's browser, send it to each newly connected user, and receive that user's bufferHeader in return. Then you might want to control who is allowed to stream from User A, User B, or maybe from Room C. That's why you need to route the media flow.

[p2p routing diagram]
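On the browser side, that exchange could look roughly like the sketch below. The presenter API follows the pattern in the library's README (verify it against the version you use); the Socket.IO connection, the `join`/`bufferHeader`/`stream` event names, the room name, and the server URL are assumptions made up for this example.

```js
// Presenter side: capture the microphone and publish the header + chunks.
var socket = io('https://your-server.example'); // hypothetical server URL
socket.emit('join', 'room-c');                  // hypothetical room event

var presenter = new ScarletsMediaPresenter({
    audio: { channelCount: 1, echoCancellation: false }
}, 100); // emit a buffer roughly every 100ms

presenter.onRecordingReady = function(packet){
    // This header packet must reach every listener before they can play
    socket.emit('bufferHeader', packet);
};

presenter.onBufferProcess = function(packet){
    socket.emit('stream', packet);
};

presenter.startRecording();
```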

You can use Socket.IO and relay the media data just like chat messages, but if you care about server performance you will need to build the media router in a lower-level language like C, Rust, or Go.
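A minimal Node/Socket.IO router in that spirit might look like the sketch below. It caches each presenter's bufferHeader per room so late joiners can receive it, and relays audio packets to everyone else in the room. The room and event names are the same hypothetical ones as in the presenter sketch above.

```js
// server.js - hypothetical Socket.IO media router sketch
const io = require('socket.io')(3000);

// room name -> { socket.id: bufferHeader packet }
const headers = {};

io.on('connection', function(socket){
    let room = null;

    socket.on('join', function(name){
        room = name;
        socket.join(room);
        headers[room] = headers[room] || {};

        // Replay every known presenter header to the late joiner
        for (const id in headers[room])
            socket.emit('bufferHeader', { from: id, packet: headers[room][id] });
    });

    // A presenter announced its media header: cache it and forward it
    socket.on('bufferHeader', function(packet){
        if (room === null) return;
        headers[room][socket.id] = packet;
        socket.to(room).emit('bufferHeader', { from: socket.id, packet: packet });
    });

    // Route audio chunks to everyone else in the room, tagged by sender
    socket.on('stream', function(packet){
        if (room !== null)
            socket.to(room).emit('stream', { from: socket.id, packet: packet });
    });

    socket.on('disconnect', function(){
        if (room !== null) delete headers[room][socket.id];
    });
});
```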

The current library version needs every streamer to receive the first buffer because it contains the media header; only after that can everyone play the audio. But currently I'm stuck with the video stream 😅 (out of workarounds). I could use WASM to take full control, but I have no time to work on it alone.
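That ordering requirement looks roughly like this on the listening side. Again a sketch: the streamer API follows the README pattern, and the message shape matches the hypothetical router above.

```js
// Streamer side: the header must arrive before any chunk is decodable.
var streamer = new ScarletsAudioStreamer(100); // ~100ms buffering
streamer.playStream();

var headerReceived = false;

socket.on('bufferHeader', function(msg){
    streamer.setBufferHeader(msg.packet);
    headerReceived = true;
});

socket.on('stream', function(msg){
    // Chunks arriving before the header can't be played yet
    if (headerReceived)
        streamer.realtimeBufferPlay(msg.packet);
});
```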

Anyway, thanks for using the library. I'm also using it to develop a social media platform.

philmetzger commented 4 years ago

So @StefansArya, have I got this right: you can have 1 presenter start a stream, and when someone connects they receive the header and can listen and also talk at the same time? Or do they also need to send a header back to the presenter to talk? From what I can tell, you can only set one buffer header (setBufferHeader), or does that concatenate? How would this work for a kind of group voice chat, i.e. multiple connected users all in the same channel talking?

StefansArya commented 4 years ago

Yes, you will need one presenter to start a stream, but it's one-way communication only. Someone who receives the buffer header can only listen to the presenter who created it, just like a radio. Usually one user will need to create 1 Presenter instance and multiple Streamer instances to do a group call. But your idea seems pretty interesting, because it may be possible to concatenate buffers on a single Streamer instance if every Presenter has a similar bufferHeader (but the timing would not be correct, so I wouldn't recommend it).


Well, let's say users A, B, and C are going to do a group call. In this case, every user needs to create 1 Presenter and 2 Streamer instances (see the sketch after this walkthrough).

User A receives the bufferHeader from user B and sets it on the first Streamer instance (on user A), then receives the bufferHeader from user C and sets it on the second Streamer instance (on user A). That way user A can listen to users B and C, but they can't talk to each other yet.

User B receives the bufferHeader from user A and sets it on the first Streamer instance (on user B), then receives the bufferHeader from user C and sets it on the second Streamer instance (on user B).

Now users A and B can talk to each other, and user B can also listen to user C, but user C still can't hear users A and B because user C also needs to receive the bufferHeader from users A and B.
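Generalizing that walkthrough, each client could keep one Presenter plus one Streamer per remote user, keyed by the sender id the router attaches. A sketch building on the hypothetical examples above:

```js
// Group call: one Presenter (my mic) + one Streamer per remote user.
var streamers = {}; // remote socket id -> ScarletsAudioStreamer

socket.on('bufferHeader', function(msg){
    // First header from this user: create a dedicated Streamer for them
    if (streamers[msg.from] === undefined){
        streamers[msg.from] = new ScarletsAudioStreamer(100);
        streamers[msg.from].playStream();
    }
    streamers[msg.from].setBufferHeader(msg.packet);
});

socket.on('stream', function(msg){
    var s = streamers[msg.from];
    if (s !== undefined)
        s.realtimeBufferPlay(msg.packet);
});
```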

Maybe I should add more examples for this library, but sadly I need to finish some business elsewhere 😅