ant-media / Ant-Media-Server

Ant Media Server is a live streaming engine software that provides adaptive, ultra low latency streaming by using WebRTC technology with ~0.5 seconds latency. Ant Media Server is auto-scalable and it can run on-premise or on-cloud.
https://antmedia.io

Second WebSocket for custom messaging to build a lobby for conference #5854

Closed timantmedia closed 10 months ago

timantmedia commented 11 months ago

Is your feature request related to a problem? Please describe.

The only way to send messages to a conference host is through the data channel, but participants need an established peer connection before the data channel can send any messages.

What's needed is another WebSocket for sending messages to a conference host without first establishing a peer connection.

For example, it could be a separate endpoint that a module uses to listen for messages independently of WebRTC, i.e.

wss://ant-media-server:5445/LiveApp/websocket/conference?room=room1
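For illustration only, a minimal client-side sketch of how such an endpoint might be used; the endpoint does not exist in Ant Media Server today, and the join message shape here is purely hypothetical.

// Hypothetical usage of the proposed conference WebSocket; neither the endpoint
// nor the "join" command exists in Ant Media Server today.
const lobbySocket = new WebSocket("wss://ant-media-server:5445/LiveApp/websocket/conference?room=room1");

lobbySocket.addEventListener("open", () => {
    // announce this participant to the room host without any peer connection
    lobbySocket.send(JSON.stringify({ command: "join", room: "room1", name: "participant1" }));
});

lobbySocket.addEventListener("message", (event) => {
    console.log("lobby message", JSON.parse(event.data));
});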

Describe the solution you'd like

The customer has implemented something similar in Wowza Streaming Engine, and this is the functionality he'd like in Ant Media Server:

Here is the Wowza backend I built; I modelled it on the Kurento demo. I also have a demo Node.js one for Millicast, although they support data channels, so I might be able to use the data channel like Ant Media if it works.

From the customer

In the server-side solution (a sketch of the corresponding client messages follows this list):

  1. The user joins via a WebSocket join message and is added to an in-memory map on the server (it could also be a database). Already-connected participants are notified of the user joining. The WebSocket session is stored and used afterwards to identify which room the user is in.
  2. When the user starts publishing, the server's publish-start event method calls the WebSocket module and only the host is notified; the host starts subscribing to the stream, and the stream is added to a lobby. When a user stops publishing, a stop event method in the server notifies the WebSocket module, which sends a WebSocket message telling clients to stop subscribing.
  3. The host chooses to add a participant from the lobby to the room with an "add to room" action. This is a WebSocket signal for the server to notify the other participants to subscribe to that stream in the room.
  4. The host can choose a participant in the room and make them the "speaker". This is a WebSocket signal telling other participants to expand that player to 80% or fullscreen using CSS. The host can choose a "layout" to position the other participants; I have a few flex CSS layouts.
  5. The host can send a WebSocket message to a participant to mute them.
  6. When a user toggles their camera on/off, the other participants are also messaged via WebSocket to either show the camera or show a profile picture.
  7. I built a WebGL-based video mixer that the host needs to run to publish the mixed room. I believe there is a possibility of doing this in Red5, but it needs to mix logo graphics and titles.
  8. I've upgraded my background removal to use MediaPipe's updated API and a WebGL shader for efficient rendering. https://electroteque.org/plugins/videojs/rtcstreaming/demos/virtual-background/
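For illustration only, hypothetical message shapes covering steps 1-6 above; the command names and fields are made up to mirror the Wowza module described here and are not part of any Ant Media Server API.

// Hypothetical lobby protocol messages for steps 1-6 above (illustrative only).
const lobbyMessages = [
    { command: "join", room: "room1", name: "Dan", title: "Host" },                        // 1. client -> server
    { command: "publishStarted", room: "room1", streamId: "stream1" },                     // 2. server -> host only
    { command: "addToRoom", room: "room1", streamId: "stream1" },                          // 3. host -> server -> participants
    { command: "setSpeaker", room: "room1", streamId: "stream1", layout: "left" },         // 4. host -> participants
    { command: "mute", room: "room1", streamId: "stream1" },                               // 5. host -> one participant
    { command: "cameraToggled", room: "room1", streamId: "stream1", cameraEnabled: false } // 6. participant -> others
];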


mekya commented 11 months ago

Hi @timantmedia,

I think it'll be much easier to do that with another streaming application because it's a specific scenario.

The base streaming application creates another endpoint in the application here -> https://github.com/ant-media/StreamApp/blob/master/src/main/java/io/antmedia/enterprise/streamapp/WebSocketLocalHandler.java#L26

You can define another class such as WebSocketConferenceHandler in another streaming application as follows, and all custom messages can be implemented in the @OnMessage-annotated method.

@ServerEndpoint(value="/websocket/conference", configurator=AMSEndpointConfigurator.class)
public class WebSocketConferenceHandler {
    @OnMessage
    public void onMessage(Session session, String message) { /* handle custom conference messages here */ }
}

Let me know if it makes sense.

Cheers Oguz

danrossi commented 11 months ago

I was the requester for this. I will see what I can do with that. I would have to create a bean reference back to the WebRTC app so I can also notify on publish start/stop of the stream. With a Red5-based application this means the app needs to be extended to capture the publish start/stop callbacks. If I can instead listen to stream publish/unpublish server events like with Wowza, with the stream linked to the "room" as the scope/instance, I can notify the specific room in the list.

So my connection to publish to would be

wss://host/WebRTCAppEE/room1scope

The published stream object would have a scope of room1scope. Via this second custom WebSocket app, that notifies the WebSocket sessions connected to the room room1scope in the list.

In the default "conference" scenario using "joinRoom", to store the participant name and title that is shown to all participants, I have to encode it as the stream name and then decode it. There is other information stored too, like a profile image, and a property set when the camera is disabled to signal that the profile image should be displayed instead.

Another reason for the custom WebSocket is to signal others when somebody toggles their camera on/off or changes their device input from no camera to camera. It toggles the profile picture and video in the player on and off.

So a server-side managed list of participants in the room, which can store custom data separately, would be better.

For my other server implementations I have some custom example Node servers to handle WebSocket messages, but it would be good to keep this contained in Ant Media Server.

this._streamInfo = { name: this.config.streamName, title:  this.config.title };
this._streamId = btoa(JSON.stringify(this._streamInfo)).replace("==", "").replace("=", "");
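And decoding it back on the receiving side is roughly like this (a sketch; atob tolerates the stripped "=" padding in current browsers):

// Decode a received streamId back into the participant info object.
const decodeStreamId = (encodedId) => JSON.parse(atob(encodedId));
// e.g. decodeStreamId(remoteStreamId).title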

If there are special server events available for publishing, that would be good to know, so I don't have to try to extend WebRTCAppEE to get to them.

danrossi commented 11 months ago

I'm sorry, this possibly won't be needed; I may be able to send custom commands over the data channel. However, when trying to send metadata with the publisher, like title and profile image, it doesn't show up in the stream list: the metadata is empty. Metadata might work better in the joinRoom command.

It might be easier to create a second WebSocket managing room lists while also implementing the multitrack feature. But for the conference studio method, streams can't be played immediately; the host has to choose who gets added to the room. The "default" conference solution will work for the host only, with the streams automatically added to a lobby area underneath.

const message = {
    command: "publish",
    streamId: streamId,
    streamName: streamName,
    token: token || "",
    subscriberId: subscriberId,
    subscriberCode: subscriberCode,
    mainTrack: mainTrack,
    metaData: metaData ? JSON.stringify(metaData) : "",
    video: this.localStream.getVideoTracks().length > 0 ? true : false,
    audio: this.localStream.getAudioTracks().length > 0 ? true : false
};

danrossi commented 11 months ago

It looks like the stream information metaData is not set. Using the multitrack demo, the metaData is empty on the roomInformation message.

{metaData: null, streamId: 'dLgceZqhQoZSKtxB', streamName: null}

So the metadata set with the publish method is not available.

webRTCAdaptor.publish(publishStreamId, token, subscriberId, subscriberCode, streamName, roomNameBox.value,"{someKey:someValue}");

USAMAWIZARD commented 11 months ago

Hi @danrossi, let me check this and get back to you. Which version of Ant Media Server are you using?

USAMAWIZARD commented 11 months ago

Hi @danrossi ,

You can get the metaData information directly from the broadcast object in the conference.

   webRTCAdaptor.getBroadcastObject(streamid);

Once you call this function, the server will send a WebSocket message that contains the broadcast object; it will look something like the example below and will contain the metaData information.

  function infoCallback(info, obj) {
      if (info === "broadcastObject") {
          if (obj.broadcast === undefined) { return; }

          let broadcastObject = JSON.parse(obj.broadcast);
          // broadcastObject.metaData holds the metaData string set at publish time
      }
  }

Broadcast object returned by server which contains metaData information {"broadcast":"{\"dbId\":{\"timestamp\":1701851959,\"counter\":192682,\"randomValue1\":13678279,\"randomValue2\":23703},\"streamId\":\"asdas_aI5Lk9f1NL\",\"status\":\"broadcasting\",\"type\":\"liveStream\",\"name\":\"asdas\",\"publish\":true,\"date\":1701851959846,\"plannedStartDate\":0,\"plannedEndDate\":0,\"duration\":99321,\"publicStream\":true,\"is360\":false,\"ipAddr\":\"103.171.201.176\",\"speed\":0.0,\"originAdress\":\"10.2.4.0\",\"mp4Enabled\":0,\"webMEnabled\":0,\"expireDurationMS\":0,\"rtmpURL\":\"rtmp:\/\/10.2.4.0\/Conference\/asdas_aI5Lk9f1NL\",\"zombi\":true,\"pendingPacketSize\":0,\"hlsViewerCount\":0,\"dashViewerCount\":0,\"webRTCViewerCount\":0,\"rtmpViewerCount\":0,\"startTime\":1701851960684,\"receivedBytes\":7530205,\"bitrate\":75,\"userAgent\":\"[Mozilla\/5.0 (X11; Linux x86_64) AppleWebKit\/537.36 (KHTML, like Gecko) Chrome\/119.0.0.0 Safari\/537.36]\",\"mainTrackStreamId\":\"tech11\",\"subTrackStreamIds\":[],\"absoluteStartTimeMs\":0,\"webRTCViewerLimit\":-1,\"hlsViewerLimit\":-1,\"dashViewerLimit\":-1,\"currentPlayIndex\":0,\"metaData\":\"{\\\"isMicMuted\\\":false,\\\"isCameraOn\\\":false,\\\"isScreenShared\\\":false}\",\"playlistLoopEnabled\":true,\"updateTime\":1701852060003}","streamId":"asdas_aI5Lk9f1NL","definition":"broadcastObject","command":"notification"}
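Note that the metaData arrives as a JSON string nested inside the broadcast JSON string, so it has to be parsed twice, roughly like this (inside the infoCallback above):

// obj.broadcast is a JSON string, and its metaData field is itself a JSON string.
const broadcastObject = JSON.parse(obj.broadcast);
const metaData = JSON.parse(broadcastObject.metaData);
console.log(metaData.isMicMuted, metaData.isCameraOn, metaData.isScreenShared);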

danrossi commented 11 months ago

(screenshot attached: Screenshot 2023-12-06 200015)

Calling getBroadcastObject for every stream in the room streamList doesn't get a message returned.

What I need is the metaData for each stream in the roomInformation streamList property, set when publishing. Updating metadata using updateStreamMetaData doesn't work either; that would be useful for updating information for each participant. Metadata for the room, also found in roomInformation, would be nice too, since there is room-specific information that needs to be stored.

I may need the second WebSocket so I can store and update properties, including properties for the room.

And I'm not sure if this multitrack feature will work out, because every participant will get all the stream tracks rather than only the ones the host chose to add. Unless there is a command to add streams as tracks to the main track after publishing: then participants don't publish to the maintrack, the host adds them as a track afterwards, and the stream shows up for people. Maybe I can make them republish to the maintrack with a data channel command to be "added to room".

danrossi commented 11 months ago

Definitely, any getBroadcastInfo command doesn't return anything.

USAMAWIZARD commented 11 months ago

Which version of Ant Media are you using right now?

danrossi commented 11 months ago

The build I have is Enterprise Edition 2.7.0 20231031_0626

danrossi commented 11 months ago

This command doesn't return any message either: getVideoTrackAssignments.

USAMAWIZARD commented 11 months ago

@danrossi, let's connect on a call to discuss the requirements so that we can provide you a better solution. Please schedule a meeting here.

danrossi commented 11 months ago

I'm not in your timezone. So updating and obtaining metadata properties for each stream and room is not working right now, hence it's empty? And calls to get broadcast info and update metadata are also not working? This would be a reason for another WebSocket.

USAMAWIZARD commented 11 months ago

Hi @danrossi, can you please try to connect with this websocket endpoint wss://ovh36.antmedia.io:5443/LiveApp/webscoket and let me know if the metadata info is updated? My bad, I forgot that the metaData support was added recently and is in the snapshot. Please let me know if it works with this websocket endpoint; then I will share a snapshot zip file that you can use to update the server on your end.

USAMAWIZARD commented 11 months ago

Hi @danrossi, did you get a chance to try with the above websocket URL? I have updated my availability for you; can you please check if any time slot works for you on Monday or Tuesday, here.

danrossi commented 11 months ago

It's also empty on LiveApp.

metaData: null,

I'm trying to get data channels working in my implementation, but they're not sending yet. I currently have to encode JSON data as the streamId. The reason I need this is that if the camera is toggled before publishing, it needs to be updated on the server so participants get that info and it is stored in a list. I use it to determine how many tracks to wait for before adding a player in multitrack mode, among other things like the profile image that is used when the camera is off.

Can the roomInfo have metadata that can be updated, to store room settings that get delivered to people? It would be used to store current layout modes, i.e. where participants are placed when one is "pinned"; I have 4 different layout options based on CSS selectors. It could also store whether somebody is currently "pinned" when joining, so that stream is played fullscreen.

I also need to see whether RTMP streams and server playlists can join the room without needing a server application.

danrossi commented 10 months ago

Any idea if that metadata can be updated and obtained from the roomInfo, and when calling for info on a stream for new participants that arrive and get added via new tracks in multitrack mode? I'll have to update properties in the peer list when new participants join. I have the integration working now, before I look at host-controlled adding of participants to the room, which might be harder to do in multitrack mode. But trying to integrate data channels isn't working with this server; only the demo code works and I can't pinpoint what it needs. Messages don't get sent across to the room multitrack peer subscriber from the publisher peer connection.

The only difference is that the streamId is base64-encoded data so I can share properties, because the metadata is empty; so the streamId is a very long string. The properties are used to get the title information and whether the camera is enabled.

"eyJuYW1lIjoiQzZMeDZrdTZGRVhnS3R0IiwidGl0bGUiOiJIb3N0IiwiaXNTY3JlZW4iOmZhbHNlLCJpc1N0dWRpb01vZGUiOmZhbHNlLCJpc0hvc3QiOnRydWUsInByb2ZpbGUiOiIuLi8uLi9pbWFnZXMvd293emEtbG9nby5wbmciLCJ1c2VyRGF0YSI6eyJwYXJhbTEiOiJ2YWx1ZTEifSwiY2FtZXJhRW5hYmxlZCI6dHJ1ZX0"

USAMAWIZARD commented 10 months ago

@danrossi, let's connect on a call to better understand the scenario. Can you please schedule a meeting here? I have updated my availability.

I need to also see without needing a server application to allow RTMP streams and server playlists to join the room

Yes, the feature is available, but currently it's only working for video and not for audio. There is a fix available for this.

This fix is currently in a PR: https://github.com/ant-media/Ant-Media-Server/issues/5569

danrossi commented 10 months ago

I just need the set metaData property to show up for now, and the call to update it to work. It's empty, like above. It's needed to set properties between participants. What version do I need for it? I need it to know if a user's camera is enabled, their profile image (to toggle posters on the player), whether the host has added them to the room, pinned state, etc.

USAMAWIZARD commented 10 months ago

@danrossi, you need to upgrade to Ant Media Server 2.8.0 for that. It is currently in snapshot and is expected to release next month.

For updating Ant Media Server to the latest version, please find the documents below: Update AMS Documentation. Ant Media Server zip file for 2.8.0.

While updating, please use -r true when running the update script. If you have any other questions or concerns, please do not hesitate to reach out.

danrossi commented 10 months ago

Thanks, I'll let you know if I have issues with it. My current version is Enterprise Edition 2.7.0 20231031_0626; hopefully that is enough to get me by without needing a second WebSocket module. I'm having issues getting data channel messages working in my integration, so I need to figure that out first. They're not being received by the multitrack peer subscriber from the publisher peer.

danrossi commented 10 months ago

I'm sorry, this might be a feature request. It would be nice for the roomInfo to have a metaData property that can be updated as well, for storing room layout states, so that when new people join a room they automatically get the set display layout, i.e. in my feature, the position where participants sit when one stream is "pinned" or a screen is shared: left, right, bottom, etc. Maybe the room metadata needs to store the "pinned" state too, so participants automatically go into that view.
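For illustration, the kind of room-level metaData I have in mind; these field names are just from my feature, not an Ant Media Server schema:

// Hypothetical room-level metaData stored on the room / maintrack and read on join.
const roomMetaData = {
    layout: "right",            // where participants sit when one stream is pinned: left, right, bottom, ...
    pinnedStreamId: "stream1",  // currently pinned stream, to be played fullscreen
    screenShared: false
};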

danrossi commented 10 months ago

getBroadcastObject also doesn't return any message, for getting metadata from a particular stream on demand. That will be needed for multitrack mode, referenced from the streamId: when a track is added, get the broadcast info first.

danrossi commented 10 months ago

I can't get that snapshot; I've requested access to the drive.

USAMAWIZARD commented 10 months ago

Hi @danrossi, I have allowed access, can you please check again?

danrossi commented 10 months ago

I've updated. The metaData is empty, and getBroadcastInfo doesn't include metadata for the stream. This is the demo code, not my integration. On the roomInfo message it would be good to have metadata for the room as well, so a message to set/get room metadata.

The metadata is the last argument of the method:

webRTCAdaptor.publish(publishStreamId, token, subscriberId, subscriberCode, streamName, "eyJuYW1lIjoicm9vbTEiLCJ0aXRsZSI6InJvb20xIn0","{someKey:someValue}");

(screenshots attached: Screenshot 2023-12-21 004753, Screenshot 2023-12-21 004843)

USAMAWIZARD commented 10 months ago

@danrossi, can you please provide me with login access to your server? usama.tahseen@antmedia.io

USAMAWIZARD commented 10 months ago

@danrossi, to show you that the metaData is working fine, I have created a CodePen example that publishes a stream with metaData and updates the metaData information. Please take a look at it.

Please make sure you are using 2.8.0; you can replace the websocket URL with your server's URL to test against your own server.

https://codepen.io/useless-signup/pen/WNmNQzL

Just click on publish and open the console option in the bottom left of the page; it will log the metaData information.
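Roughly what the example does, as a sketch: the publish arguments follow the calls shown earlier in this thread, and the (streamId, metaData) signature for updateStreamMetaData is my assumption here, so check the CodePen for the exact call.

// Publish with metaData, then update it later (sketch; webRTCAdaptor, streamId
// and streamName come from the usual adaptor setup).
const metaData = { isCameraOn: true, isMicMuted: false };
webRTCAdaptor.publish(streamId, "", "", "", streamName, "", JSON.stringify(metaData));

// later, e.g. when the camera is toggled off
metaData.isCameraOn = false;
webRTCAdaptor.updateStreamMetaData(streamId, JSON.stringify(metaData)); // signature assumed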

USAMAWIZARD commented 10 months ago

https://codepen.io/useless-signup/pen/WNmNQzL (screenshot attached)

danrossi commented 10 months ago

I'm sorry, I meant in the streamList of the room information, like above. So I have to do individual calls to get the object of each individual subtrack of the maintrack before making a call to play the room stream. That way I have a map of the streams before tracks are added. And to store properties for the room, they have to be in the metadata of the maintrack, so I don't make calls to get roomInformation anymore.

I'm building features in development, so the server is in WSL, I'm afraid.

danrossi commented 10 months ago

I'm finding new tracks when playing back for which I won't have that info yet. I have to obtain the info while collecting the tracks, then add the tracks, then start a player, so I can get the property saying the camera is enabled. For current streams I get the objects beforehand. Very complicated. It might be easier to send a custom WebSocket signal before publishing starts: in my other feature, when people join a room before publishing, I notify others, so I fill the participant list before anyone starts publishing. When devices are toggled, it then updates the properties for everyone. I'll see what I can do; a lot of promises are required for the object collectors before doing anything.

danrossi commented 10 months ago

(screenshots attached: Screenshot 2023-12-24 211208, Screenshot 2023-12-24 211612)

Using the updated metadata features I have it working again, including data channels for the host signal to pin a participant. I can also update metadata on the room maintrack stream, and other participants can get room configurations from the maintrack metadata.

Before playing the maintrack, I have to collect info of all current subtracks so I have the info gathered already.

For new subtracks it's very complicated how it needs to work. When a track is added I have to collect the broadcast info asynchronously, but tracks keep being added while that info is being collected. I have to keep adding tracks to the stream while collecting info, with a few states to block it from running further, then continue asynchronously after the info is collected. In that time both tracks are added and I add a player which shows the title collected from "getBroadcastInfo".

My last issue is to somehow allow the host to choose which tracks get added to the room; that is part of the studio feature: adding and removing streams in the room from a lobby area. The only way I can see it working is to force the participant to republish to a different maintrack, with one maintrack for the "lobby" area.

peer.on('track', async (e, track, stream, transceiver) => {
            const map = this.idMapping.get(sender);
            try {

                const trackId =  map[transceiver.mid];
                const incomingTrackId = formatTrackId(trackId);

                if(incomingTrackId == sender || incomingTrackId == this.streamId) {
                    return;
                }

                let incomingPeer = this.getOrSetPeer(incomingTrackId, {});

                //console.error("peer",incomingPeer, incomingTrackId);
                if (!incomingPeer.stream) incomingPeer.stream = new MediaStream();

                //if new user joining
                if (!incomingPeer.joined) {

                    incomingPeer.stream.addTrack(track);

                    if (incomingPeer.collecting) return;

                    incomingPeer.collecting = true;

                    //get broadcast object for new user
                    incomingPeer = await this.getBroadcastObject(incomingTrackId);

                    incomingPeer.collecting = false;
                    incomingPeer.joined = true;
                    //incomingPeer = this.getOrSetPeer(incomingTrackId, peerObject);

                    console.log("mew peer ", incomingPeer, incomingPeer.stream.getTracks());

                    if (incomingPeer.cameraEnabled && incomingPeer.stream.getTracks().length < 2) {
                        //haven't got both tracks yet let it gather the second time around
                    } else {
                        this.emit("participantStreamAvailable", incomingPeer, incomingPeer.stream);
                    }

                } else {

                    if (incomingPeer.cameraEnabled) {
                        incomingPeer.stream.addTrack(track);

                        if (incomingPeer.stream.getTracks().length > 1) {
                            console.log("stream ready", incomingPeer);
                            this.emit("participantStreamAvailable", incomingPeer, incomingPeer.stream);
                        }
                    } else {
                        incomingPeer.stream.addTrack(track);
                        this.emit("participantStreamAvailable", incomingPeer, incomingPeer.stream);
                    }

                }

                stream.onremovetrack = event => {

                    const removedTrackId = formatTrackId(event.track.id);

                    console.log("track is removed with id: " + removedTrackId, removedTrackId == this.streamId );

                    if (removedTrackId == this.streamId) return;

                    const participant = this.room.get(removedTrackId);

                    console.log("Remove participant ", participant);

                    if (participant) this.emit("participantStreamUnAvailable", participant.name);

                    this.participantLeft(removedTrackId);

                }

            } catch (e) {
                console.log(e);
            }

        });

Get broadcast info looks like this, resolved after the subtrack info is obtained, so a lot of promises are required.

getBroadcastObject(streamId) {

        return new Promise((resolve,reject) => {
            let payload = {
                streamId: streamId,
                transId: Math.random() * 10000,
              }
              payload.resolve = resolve;
              payload.reject  = reject;

              this.transactions.set(payload.streamId,payload);

              //console.log("send command ", payload);

            const message = {
                command: "getBroadcastObject",
                streamId : streamId
            };

            console.log("broadcastinfo ", message);

            this.send(message);
        });

    }

The subtrack info method

handleSubtrackBroadcastObject(broadcastObject) {
        if (broadcastObject.metaData !== undefined && broadcastObject.metaData !== null) {
            let userStatusMetadata = JSON.parse(broadcastObject.metaData);
            let incomingPeer = this.getOrSetPeer(broadcastObject.streamId, userStatusMetadata);
            incomingPeer.joined = true;

            this.transactions.get(broadcastObject.streamId).resolve(incomingPeer);

            this.transactions.delete(broadcastObject.streamId);

        }

    }

Main track info to collect the room config set by the host.

    handleMainTrackBroadcastObject(broadcastObject) {
        this.subTrackPromises = [];

        //dispatch room config metadata
        if (broadcastObject.metadata) {
            const metadata = JSON.parse(broadcastObject.metadata);
            console.log("room state", metadata);
            this.emit("roomState", metadata.state);
            this.emit("roomConfig", metadata);
        }

        if (broadcastObject.subTrackStreamIds) {
            const updatedStreams = broadcastObject.subTrackStreamIds;
            //find and remove not available tracks
            this.currentStreams.forEach(stream => {
                if(!updatedStreams.includes(stream)) {
                    //console.log("participant left ", stream);
                    this.participantLeft(stream);
                }
            });

            this.currentStreams = updatedStreams;

            this.onExistingParticipant(updatedStreams);

            //request broadcast object for new tracks
            updatedStreams.forEach(async(pid) => {
                await this.getBroadcastObject(pid);
            });

        } 

            this.getOrSetPeer(this.multiTrackName, {});

            this.receiveVideo(this.multiTrackName);

    }

danrossi commented 10 months ago

You can possibly close this. However, I'm trying to implement my feature where the host chooses which tracks get added to the room, and only the host gets all the tracks/streams into a lobby area.

What does the assignVideoTrackCommand command do? I need to make a call to add/remove participants, i.e. track add and removal in multitrack mode. Do I have to store all the tracks and only play them when requested? Are those tracks received regardless, as if they are in the room, even if they aren't automatically played?

USAMAWIZARD commented 10 months ago

@danrossi, you can take this approach: everyone will publish their streams to Ant Media Server as two different tracks, and you can use the APIs below to add the tracks to the room or remove them from the room. Let me know if this approach works for you.

I need to make a call to add / remove participants so track add and removal in multitrack mode

https://antmedia.io/rest/#/BroadcastRestService/addSubTrack

https://antmedia.io/rest/#/BroadcastRestService/removeSubTrack
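A rough host-side sketch of calling those endpoints; the base URL is a placeholder and the POST/DELETE verbs are my reading of the REST reference, so please verify against the docs linked above.

// Add or remove a participant stream as a subtrack of the room main track
// (run inside an async function; verbs assumed from the REST reference).
const rest = "https://your-server:5443/WebRTCAppEE/rest/v2/broadcasts";

await fetch(`${rest}/room1/subtrack?id=participantStreamId`, { method: "POST" });   // addSubTrack
await fetch(`${rest}/room1/subtrack?id=participantStreamId`, { method: "DELETE" }); // removeSubTrack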

danrossi commented 10 months ago

OK, I understand. So there is no WebSocket API for this, and I have to wire up API calls with an API key instead. So the maintrack is a secondary main track that other participants play instead, and the host plays two maintracks: one to add everyone to the lobby, another for the room.

USAMAWIZARD commented 10 months ago

Yes, you are right, you can use this approach. Please let me know if I should close this.

danrossi commented 10 months ago

Update: I have to create the room stream main track with the API before adding subtracks to it. Adding/removing works: the streams appear for the participant and then get removed, while the host is still subscribing to the tracks in the lobby stream. For the host I make it mirror the streams from the lobby stream instead of resubscribing.

https://rtc.electroteque.org:5443/WebRTCAppEE/rest/v2/broadcasts/create
https://rtc.electroteque.org:5443/WebRTCAppEE/rest/v2/broadcasts/room1-room/subtrack?id=C6Lx6ku6FEXgKtt
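Roughly what the create step looks like before adding subtracks as above (a sketch; the JSON body field is my assumption, so check the Broadcast REST docs):

// Create the room main track first, then attach participant streams via /subtrack.
await fetch("https://rtc.electroteque.org:5443/WebRTCAppEE/rest/v2/broadcasts/create", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ streamId: "room1-room" }) // body field assumed
});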

danrossi commented 10 months ago

If I add a streamId as a subtrack to the second (room) maintrack stream, then when publishing stops it is still listed as a subtrack of the lobby maintrack it was publishing to. It also says it's still broadcasting, although no stream is broadcasting. When a stream is added to the second main room track and publishing stops, the subtracks of the lobby maintrack stream keep appending and don't clear.

If I have to remove the track from the second maintrack stream as well as the lobby maintrack stream it is publishing to before unpublishing, it might be better done over a WebSocket, so I don't have to configure API tokens for every participant page.

(screenshot attached: Screenshot 2023-12-30 161852)

Trying to call delete on the subtrack in the lobby main stream, it's still trailing as a subtrack because I had added it to the room main track. Calling stop publish should remove them as subtracks. They keep appending if a republish happens and don't clear.

https://rtc.electroteque.org:5443/WebRTCAppEE/rest/v2/broadcasts/room1-lobby/subtrack?id=C6Lx6ku6FEXgKtt

(screenshot attached: Screenshot 2023-12-30 202415)

The only way to stop it appending trailing subtracks that are not publishing is to remove the stream from one maintrack and add it to the other. But then I lose the streams from the lobby maintrack.

danrossi commented 10 months ago

There is no track remove event, only a "play_finished" event for the maintrack when the only publisher to that main track stops. It requires attempting to replay the main track to get further tracks.

danrossi commented 10 months ago

I also have an issue where the host can't determine whether something is published directly to the room main track unless it also subscribes to the room main track, not just the lobby track. So that means publishing their own stream, publishing the GPU merger stream, and subscribing to two main tracks. A WebSocket message when anyone joins the room and publishes to it would help, hence the need for the second WebSocket. I tried sending a data channel message from the screen share publishing to the main room track, but the host's own peer connection can't receive a message; messages only turn up for the subscribed streams. My feature makes screen shares publish to a separate stream and show up separately.

danrossi commented 10 months ago

I'm sorry for all the effort required. I have a working integration; there are only a few bugs I found, detailed above, around detecting that a publisher has ended. If they are the only track in the lobby maintrack, there is no track close event.

Then there are the lingering stream IDs in the lobby main track if they were previously added to and removed from the room maintrack. Is there a delay for garbage collection on the main tracks?

USAMAWIZARD commented 10 months ago

@danrossi, can you please describe the exact step-by-step scenario so that I can reproduce this on my end?

USAMAWIZARD commented 10 months ago

@danrossi, if I understand it correctly, I have tried to reproduce the issue in this way.

also where host can't determine if something is published directly to the room main track unless it also subscribes to the room main track not just the lobby track. So publishing their own stream, publishing the GPU merger stream, subscribing to two main tracks. A websocket message of anyone joining the room, and publishing to that room might help hence the need for the second websocket.

If you are still considering going with a new WebSocket instead of the metaData approach, please take a look at this example; you can implement something like this for a new WebSocket connection.

You can implement the ApplicationContextAware and IStreamListener interfaces and implement the methods below, which will be called whenever anyone joins or leaves the room, so you can keep track of it; take a look at this for a sample.

public void streamStarted(String streamId)                 // a stream started publishing
public void streamFinished(String streamId)                // a stream stopped publishing
public void joinedTheRoom(String roomId, String streamId)  // a participant joined the room
public void leftTheRoom(String roomId, String streamId)    // a participant left the room

So you can implement the above methods and keep track of everyone.