AgoraIO-Community / Agora_Unity_WebGL

WebGL plugin for Unity, beta release
MIT License

[BUG] OnUserEnableLocalVideo callback event does not get triggered properly #172

Closed · hoomoongoos closed this issue 1 year ago

hoomoongoos commented 1 year ago

I've been trying to subscribe a function to the OnUserEnableLocalVideo callback to show or hide each user's canvas when they start or stop streaming their webcam, but the event doesn't seem to be registering properly.

I subscribed the following function to the callback to test whether it actually worked, but the console only outputs the Debug.Log message once, as soon as a new user joins the room. On top of that, the value of the `enabled` parameter prints as true, even though I set EnableLocalVideo to false in each user's Awake().

```csharp
void OnUserEnableLocalVideo(uint uid, bool enabled) { Debug.Log($"{uid} - {enabled}"); }

rtcEngine.OnUserEnableLocalVideo += OnUserEnableLocalVideo;
```

(screenshot of the console output)

After the user joins the room, enabling/disabling their local video doesn't trigger the callback function or print anything else on the console anymore.


Full code:

```csharp

public class AgoraManager : MonoBehaviourPunCallbacks {
string appID = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

public static AgoraManager Instance;

private IRtcEngine rtcEngine;

private static bool isMuted = false;
public static bool isDisplayingWebcam = false;

// A list of dimensions for switching
VideoDimensions[] dimensions = new VideoDimensions[]{
    new VideoDimensions { width = 640, height = 480 },
    new VideoDimensions { width = 480, height = 480 },
    new VideoDimensions { width = 480, height = 240 }
};
int dim = 0;
private const float Offset = 100;

private void Awake() {
    // Singleton
    if (Instance != null)
        Destroy(gameObject);
    else {
        Instance = this;
        DontDestroyOnLoad(gameObject);
    }

    // Inits the RTC Engine and subscribes some functions to the RTC Agora callbacks
    InitEngine();
}

private void Update() { RequestPerms(); }

#region Initialize
// Function that inits the RTC Engine
void InitEngine() {
    rtcEngine = IRtcEngine.GetEngine(appID);

    rtcEngine.SetChannelProfile(CHANNEL_PROFILE.CHANNEL_PROFILE_LIVE_BROADCASTING);
    rtcEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER);

    rtcEngine.EnableAudio();
    rtcEngine.EnableVideo();
    rtcEngine.EnableLocalVideo(false);
    rtcEngine.EnableVideoObserver();

    rtcEngine.OnJoinChannelSuccess += OnJoinChannelSuccess;
    rtcEngine.OnUserJoined += OnUserJoinedHandler;
    rtcEngine.OnLeaveChannel += OnLeaveChannel;
    rtcEngine.OnUserEnableLocalVideo += OnUserEnableLocalVideo;
    rtcEngine.OnError += OnError;
    rtcEngine.OnConnectionLost += OnConnectionLostHandler;

    SetVideoEncoderConfiguration(0);
}

// Returns the RTC Engine
public IRtcEngine GetRtcEngine() {
    return rtcEngine;
}

// Requests microphone and camera permissions until the user accepts them
private void RequestPerms() {
    PermissionHelper.RequestMicrophontPermission();
    PermissionHelper.RequestCameraPermission();
}

public void SetVideoEncoderConfiguration(int dim = 0) {
    if (dim >= dimensions.Length) {
        Debug.LogError("Invalid dimension choice!");
        return;
    }

    VideoEncoderConfiguration config = new VideoEncoderConfiguration {
        dimensions = dimensions[dim],
        frameRate = FRAME_RATE.FRAME_RATE_FPS_15,
        minFrameRate = -1,
        bitrate = 0,
        minBitrate = 1,
        orientationMode = ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE,
        degradationPreference = DEGRADATION_PREFERENCE.MAINTAIN_FRAMERATE,
        mirrorMode = VIDEO_MIRROR_MODE_TYPE.VIDEO_MIRROR_MODE_AUTO
    };
    rtcEngine.SetVideoEncoderConfiguration(config);
}
#endregion

#region MicSwitchState
// Function that toggles the mute state of the user's audio input source
public void Mute() {
    isMuted = !isMuted;
    UIManager.Instance.ToggleUIButtonColor("mic", isMuted);
    rtcEngine.MuteLocalAudioStream(isMuted);
    Debug.Log($"You are now {(isMuted ? "muted" : "unmuted")}!");
}

// Function that sets the mute state of the user's audio input source
public void MicSwitchState(bool _isMuted) {
    isMuted = !_isMuted;
    this.Mute();
}
#endregion

#region CamSwitchState
public void OnButtonCameraSwitchActive() {
    if (!isDisplayingWebcam)
        MakeVideoView(0);
    else
        DestroyVideoView(0);
}

public void CamSwitchState(bool isDisplay) {
    isDisplayingWebcam = !isDisplay;
    OnButtonCameraSwitchActive();
}
#endregion

#region Webcam Video Streaming
public void MakeVideoView(uint uid) {
    GameObject go = GameObject.Find(uid.ToString());
    if (!ReferenceEquals(go, null))
        return; // Reuse

    // Create a GameObject and assign to this new user
    VideoSurface videoSurface = MakeImageSurface(uid.ToString());
    if (!ReferenceEquals(videoSurface, null)) {
        // Configure videoSurface
        videoSurface.SetForUser(uid);
        videoSurface.SetEnable(true);
        videoSurface.SetVideoSurfaceType(uid.ToString() != "0" ? AgoraVideoSurfaceType.Renderer : AgoraVideoSurfaceType.RawImage);
    }
}

private VideoSurface MakeImageSurface(string goName) {
    GameObject go;

    if (goName != "0") {
        go = GameObject.CreatePrimitive(PrimitiveType.Plane);
        go.transform.name = goName;
        Destroy(go.GetComponent<MeshCollider>());

        GameObject userGO = SpatialAudio.GetPlayerFromUID(goName);
        go.transform.parent = userGO.GetComponent<SpatialAudio>().webcamParent;
        go.transform.localPosition = Vector3.zero;
        go.transform.localRotation = Quaternion.Euler(Vector3.zero);
        go.transform.localScale = Vector3.one;

        MeshRenderer mr = go.GetComponent<MeshRenderer>();
        mr.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;
        Material mat = new Material(Shader.Find("Unlit/Texture"));
        mat.color = Color.white;
        mr.material = mat;
        return go.AddComponent<VideoSurface>();
    }
    else {
        if (!isDisplayingWebcam) {
            rtcEngine.EnableLocalVideo(true);
            Debug.Log($"Webcam data stream started!");

            isDisplayingWebcam = true;
            UIManager.Instance.ToggleUIButtonColor("cam", isDisplayingWebcam);
            UIManager.Instance.canvasProxy.cameraImage.transform.parent.gameObject.SetActive(true);
            return UIManager.Instance.canvasProxy.cameraImage.GetComponent<VideoSurface>();
        }
        else return null;
    }
}

public void DestroyVideoView(uint uid) {
    if (uid != 0) {
        GameObject go = GameObject.Find(uid.ToString());
        Destroy(go);
    }
    else {
        rtcEngine.EnableLocalVideo(false);
        Debug.Log($"Webcam data stream stopped!");

        isDisplayingWebcam = false;
        UIManager.Instance.ToggleUIButtonColor("cam", isDisplayingWebcam);
        UIManager.Instance.canvasProxy.cameraImage.transform.parent.gameObject.SetActive(false);
    }
}
#endregion

#region Callbacks [Photon] (Syncs Photon and Agora rooms)
// On Joined Room callback [PHOTON]
public override void OnJoinedRoom()  {
    if (PhotonNetwork.CurrentRoom.Name != "Main")
        rtcEngine.JoinChannel(PhotonNetwork.CurrentRoom.Name);
}

// On Left Room callback [PHOTON]
public override void OnLeftRoom() {
    rtcEngine.LeaveChannel();
}
#endregion

#region Callbacks [Agora]
// On Join Channel Success callback [AGORA]
void OnJoinChannelSuccess(string channelName, uint uid, int elapsed) {
    Debug.Log($"Channel {channelName} joined!");

    // Creates Player's custom properties as hashtable
    Hashtable hash = new Hashtable();
    hash.Add("agoraID", uid.ToString());
    PhotonNetwork.SetPlayerCustomProperties(hash);

    // Refreshes the player last scene's mic and cam state to the new one
    this.MicSwitchState(isMuted);
    this.CamSwitchState(isDisplayingWebcam);
}

void OnUserJoinedHandler(uint uid, int elapsed) {
    Debug.Log($"User {uid} joined!");
    //MakeVideoView(uid);
}

// On Leave Channel callback [AGORA]
void OnLeaveChannel(RtcStats stats) {
    Debug.Log($"Left channel with duration {stats.duration}s...");
}

void OnUserEnableLocalVideo(uint uid, bool enabled) {
    Debug.Log($"{uid} - {enabled}");
}

// On Error callback [AGORA]
void OnError(int error, string msg) {
    Debug.Log($"Error with agora:{(msg != "" ? $" {msg}\n" : " ")}(ERROR ID: {error})");
}

void OnConnectionLostHandler() {
    Debug.Log(string.Format("Connection lost..."));
}
#endregion

void OnApplicationQuit() {
    Debug.Log("Application quitted...");
    if (rtcEngine != null) {
        rtcEngine.LeaveChannel();
        rtcEngine.DisableVideoObserver();
        IRtcEngine.Destroy();
    }
}
}

```

icywind commented 1 year ago

I don't think OnUserEnableLocalVideo is implemented on WebGL. Try subscribing to OnUserMuteVideo instead.
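For context, a minimal sketch of that approach, assuming the same v3-style IRtcEngine instance used in the code above (the handler name and the MuteLocalVideoStream call are assumptions to double-check against the plugin):

```csharp
// Sketch only: listen for remote users muting/unmuting (publishing/unpublishing) video.
rtcEngine.OnUserMuteVideo += HandleUserMuteVideo;

void HandleUserMuteVideo(uint uid, bool muted) {
    Debug.Log($"User {uid} video muted: {muted}");
}

// On the sending side, the published video stream would then be toggled with:
rtcEngine.MuteLocalVideoStream(true);   // stop sending the local video stream
rtcEngine.MuteLocalVideoStream(false);  // resume sending the local video stream
```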

hoomoongoos commented 1 year ago

I've tried it and, to be honest, the video-related callbacks are being super inconsistent right now. I've tried using OnUserEnableLocalVideo, OnUserMuteVideo, and OnRemoteVideoStateChanged to check whether a user has enabled their video, and there have been times when those callbacks fired correctly every time, and other times when they only fired once on startup.

I'm at a point where they've just stopped working completely, even though I'm absolutely certain that other users' local video is being enabled and disabled.

I really don't know what makes them work sometimes, but most of the time they don't work at all.

icywind commented 1 year ago

@hoomoongoos our tests show the callbacks work properly, although it may be too much if all of these events are registered. Underneath, they correspond to whether the user's stream is published or not; there isn't really a one-to-one functional match between muteVideo and disableLocalVideo. Please bear in mind that, as a wrapper, the WebGL SDK mimics what the native SDK does, but they are not in 1-to-1 parity. I would suggest using OnUserMuteVideo if you are using the RtcEngine instance, or OnRemoteVideoStateChanged if you are using multichannel in your implementation, but not all of them at once. That said, we will try to implement OnUserEnableLocalVideo as a direct response to this issue.
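For reference, a rough sketch of what the OnRemoteVideoStateChanged route can look like. This is written against the single-engine v3-style API for brevity; the handler signature and the REMOTE_VIDEO_STATE enum members are assumptions that should be verified against the plugin, and multichannel wiring will differ:

```csharp
// Sketch only: react to remote video state transitions.
rtcEngine.OnRemoteVideoStateChanged += HandleRemoteVideoStateChanged;

void HandleRemoteVideoStateChanged(uint uid, REMOTE_VIDEO_STATE state,
    REMOTE_VIDEO_STATE_REASON reason, int elapsed) {
    // DECODING roughly means the remote stream is being received and can be rendered;
    // STOPPED means it is no longer being received.
    if (state == REMOTE_VIDEO_STATE.REMOTE_VIDEO_STATE_DECODING)
        Debug.Log($"User {uid} started sending video (reason: {reason})");
    else if (state == REMOTE_VIDEO_STATE.REMOTE_VIDEO_STATE_STOPPED)
        Debug.Log($"User {uid} stopped sending video (reason: {reason})");
}
```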

nathanGroovy commented 1 year ago

I agree with @icywind's previous comment. OnUserEnableLocalVideo hasn't been properly implemented and will need some work before it's functional. But OnUserMuteVideo does seem to work fine with the RtcEngine; I've been testing it myself. Here's how I'm using the callback:

```csharp
mRtcEngine.OnUserMuteVideo = userVideoMutedHandler;

void userVideoMutedHandler(uint uid, bool muted) {
    logger.UpdateLog(string.Format("onUserMuteHandler uid: {0}, muted: {1}", uid, muted));
}
```

I'll be pushing the adjustments I made to the repo. But copy and paste over the following functions in clientManager.js to fix a bug where the callback is being fired too many times:

handleUserPublished():

```js
async handleUserPublished(user, mediaType) {
  const id = user.uid;
  if (this.audioSubscribing && mediaType == "audio" &&
      (mediaType == "audio" && this.screenShareClient == null ||
       mediaType == "audio" && this.screenShareClient != null && id != this.screenShareClient.uid)) {
    await this.subscribe_remoteuser(user, mediaType);
  } else if (this.videoSubscribing && mediaType == "video" && remoteUsers) {
    await this.subscribe_remoteuser(user, mediaType);
    this.getRemoteVideoStats(id);
  }
  remoteUsers[id] = user;
}
```

handleUserUnpublished():

```js
handleUserUnpublished(user, mediaType) {
  const id = user.uid;
  var strUID = id.toString();
}
```

This will remove the unnecessary mute event callbacks that were being fired on publish and unpublish. Let us know how it goes for you and whether you're still having trouble with OnUserMuteVideo.
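If it helps, here is one possible way to tie this back to the original goal of showing/hiding each user's canvas. This is only a sketch that reuses the MakeVideoView/DestroyVideoView methods from the code posted above:

```csharp
// Sketch only: drive the existing MakeVideoView/DestroyVideoView helpers from
// the mute-video callback instead of OnUserEnableLocalVideo.
void userVideoMutedHandler(uint uid, bool muted) {
    if (muted)
        DestroyVideoView(uid);   // hide this user's video plane/canvas
    else
        MakeVideoView(uid);      // create or re-enable this user's video surface
}
```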