microsoft / MixedReality-WebRTC

MixedReality-WebRTC is a collection of components to help mixed reality app developers integrate audio and video real-time communication into their application and improve their collaborative experience.
https://microsoft.github.io/MixedReality-WebRTC/
MIT License

On stopping the screen share, frames captured using ExternalVideoTrackSource are frozen at the remote end. #686

Open KarthikRichie opened 3 years ago

KarthikRichie commented 3 years ago

Describe the bug

I'm using the following code with MixedReality-WebRTC 2.0.2 to share my screen from a UWP app with a remote user (on the web) using ExternalVideoTrackSource:

```csharp
_videoSrc = ExternalVideoTrackSource.CreateFromArgb32Callback(Screenshare.Instance.WebRTCFrameCallback);
_localVideoTrack = LocalVideoTrack.CreateFromSource(_videoSrc, new LocalVideoTrackInitConfig()
{
    trackName = "screenshare"
});
_videoTransceiver = _Connection.AddTransceiver(MediaKind.Video, new TransceiverInitSettings()
{
    InitialDesiredDirection = Transceiver.Direction.SendReceive,
    StreamIDs = new List<string>() { "local_av_stream" }
});
_videoTransceiver.LocalVideoTrack = _localVideoTrack;
```

To stop my screen share, I'm setting the LocalVideoTrack attached to the video transceiver to null:

```csharp
_videoTransceiver.LocalVideoTrack = null;
_localVideoTrack = null;
```

The above code stops the screen capture on my end as expected.

But, the

WebRTCFrameCallback(in FrameRequest frameRequest)

is still getting called, but I have no frames to send because the screen share is stopped. Is there a way to let the remote end know that no more frames will be sent? At the remote end, the captured screen is frozen; ideally, the remote screen should be cleared. I checked the behavior from Android/iOS apps (that use WebRTC) against the same client the remote user is using, and it works as expected (i.e., the captured screen at the remote end is cleared when screen capture stops). Only for screens shared from UWP using MixedReality-WebRTC 2.0.2 am I facing this issue. Any thoughts? Am I missing something?

djee-ms commented 3 years ago

Duplicate of #681, no? See also the discussion with @AtosNicoS in #139. Are you continuing to get frames that are frozen (callback called with the same frame data), or is the rendering callback not called anymore? You can work around this by setting a timer: if the callback is not called for, say, 300 ms, assume the stream stopped and clear the UI to black.
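For reference, a minimal sketch of the watchdog timer workaround described above. The names `_lastFrameTime`, `OnRemoteFrameReceived`, and `ClearRemoteVideoView` are illustrative, not part of the MixedReality-WebRTC API; the frame callback and the UI-clearing logic depend on your app.

```csharp
using System;

// Tracks when the last remote frame arrived; if none arrive for ~300 ms,
// assume the remote stream stopped and clear the video view.
private DateTime _lastFrameTime = DateTime.UtcNow;

// Call this from your remote video frame callback whenever a frame arrives.
private void OnRemoteFrameReceived()
{
    _lastFrameTime = DateTime.UtcNow;
}

// Poll this periodically, e.g. from a DispatcherTimer ticking every 100 ms.
private void CheckRemoteStreamAlive()
{
    if (DateTime.UtcNow - _lastFrameTime > TimeSpan.FromMilliseconds(300))
    {
        ClearRemoteVideoView(); // hypothetical helper: paint the video surface black
    }
}
```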

KarthikRichie commented 3 years ago

It's actually the other way around. I'm sharing my screen (in UWP) to a remote web client, and

WebRTCFrameCallback(in FrameRequest frameRequest)

is getting called, requesting that I send a captured screen frame to the remote end, which I do by fetching the latest frame from my frame pool and calling

frameRequest.CompleteRequest

The moment I stop my screen share, the following callback from the MixedReality-WebRTC library is still called:

WebRTCFrameCallback(in FrameRequest frameRequest)

but I have no frames to send. This results in a frozen image at the remote end. I was just wondering if there is a way to gracefully let the remote end know that I've stopped my screen share.

djee-ms commented 3 years ago

To stop my screenshare, I'm setting localvideotrack attached to the videoTransceiver to null

You're not stopping the video track source; you're only detaching the source from the transceiver. So the source is still live and continues to produce frames (and therefore the callback is called). Those frames are then discarded because there's no track connected to the source. You need to destroy the ExternalVideoTrackSource once the track is detached, and recreate it the next time you need to restart screen sharing. You cannot "pause" a video track source; the internal Google design doesn't allow that.
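A sketch of the full stop sequence this implies, reusing the variable names from the snippets earlier in the thread. This assumes `LocalVideoTrack` and `ExternalVideoTrackSource` are disposed to destroy the native objects; restarting later means recreating both.

```csharp
// Detach the track from the transceiver, then destroy both the track and
// its source so the Argb32 frame callback stops being invoked.
_videoTransceiver.LocalVideoTrack = null;  // detach; transceiver keeps sending media direction
_localVideoTrack?.Dispose();               // destroy the local video track
_localVideoTrack = null;
_videoSrc?.Dispose();                      // destroy the source; WebRTCFrameCallback is no longer called
_videoSrc = null;

// To restart screen sharing later, recreate both objects:
// _videoSrc = ExternalVideoTrackSource.CreateFromArgb32Callback(Screenshare.Instance.WebRTCFrameCallback);
// _localVideoTrack = LocalVideoTrack.CreateFromSource(_videoSrc, new LocalVideoTrackInitConfig { trackName = "screenshare" });
// _videoTransceiver.LocalVideoTrack = _localVideoTrack;
```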

djee-ms commented 3 years ago

Note that on the remote side, the issue after detaching the track is exactly what #681 is about.

Mt-Perazim commented 3 years ago

@KarthikRichie Hey, would it be possible for you to give me a little information on how you implemented your screen capturing? I am currently facing exactly this problem. I'm using GraphicsCapturePicker for screen capturing, and what I get is a Direct3D11CaptureFrame. I don't know how to make the connection between my Direct3D11CaptureFrame and those DeviceVideoTrackSource and I420AVideoFrame/Argb32VideoFrame types. But I found your post and see that you used ExternalVideoTrackSource for that. I also created a post about that.

KarthikRichie commented 3 years ago

@arsi0001 , I've added the solution here.