[Open] KarthikRichie opened this issue 3 years ago
Duplicate of #681, no? See also the discussion with @AtosNicoS in #139. Are you continuing to get frames that are frozen (callback called with the same frame data), or is the rendering callback not called anymore? You can work around this by setting a timer: if the callback is not called for, say, 300 ms, assume the stream stopped and clear the UI to black.
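A minimal sketch of that watchdog idea in plain C# (the class name, timeout, and callback wiring are all hypothetical; this is independent of the MixedReality-WebRTC API):

```csharp
using System;
using System.Threading;

// Hypothetical watchdog: reset it on every received remote frame; if no
// frame arrives within the timeout, assume the remote stopped sharing
// and let the app clear the video surface to black.
class FrameWatchdog : IDisposable
{
    private readonly Timer _timer;

    public FrameWatchdog(Action onStreamStopped, int timeoutMs = 300)
    {
        TimeoutMs = timeoutMs;
        // One-shot timer; fires only if OnFrameReceived() stops being called.
        _timer = new Timer(_ => onStreamStopped(), null, timeoutMs, Timeout.Infinite);
    }

    public int TimeoutMs { get; }

    // Call this from the remote video frame callback to push the deadline back.
    public void OnFrameReceived() => _timer.Change(TimeoutMs, Timeout.Infinite);

    public void Dispose() => _timer.Dispose();
}
```

The 300 ms threshold is a guess; pick something comfortably longer than the expected inter-frame interval at your lowest frame rate.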
It's actually the other way around. I'm sharing my screen (in UWP) to a remote web client, and `WebRTCFrameCallback(in FrameRequest frameRequest)` is getting called, requesting me to send a captured screen frame to the remote end, which I do by calling `frameRequest.CompleteRequest` with the latest frame from my frame pool.
The moment I stop my screen share, the `WebRTCFrameCallback(in FrameRequest frameRequest)` callback from the MixedReality-WebRTC library is still called, but I have no frames to send. This results in a frozen image on the remote end. I was just wondering if there is a way to gracefully let the remote end know that I've stopped my screen share.

To stop my screen share, I'm setting the `LocalVideoTrack` attached to the `videoTransceiver` to `null`.
You're not stopping the video track source, you're only detaching the source from the transceiver. So the source is still live and continues to produce frames (and therefore the callback is called). Those frames are then discarded because there's no track connected to the source. You need to destroy the ExternalVideoTrackSource once the track is detached, and recreate it next time you need it to restart screen share. You cannot "pause" a video track source; the Google internal design doesn't allow that.
Note that on the remote side, the issue after detaching the track is exactly what #681 is about.
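A rough sketch of that teardown and restart, assuming the MixedReality-WebRTC 2.0 C# API (`_screenSource`, `_screenTrack`, `_videoTransceiver`, and the track name are hypothetical names for illustration):

```csharp
// Stop screen share: detach the track from the transceiver,
// then destroy both the track and the source. Disposing the
// ExternalVideoTrackSource is what stops the frame request callbacks.
_videoTransceiver.LocalVideoTrack = null;
_screenTrack?.Dispose();
_screenTrack = null;
_screenSource?.Dispose();
_screenSource = null;

// Restart later: recreate the source and track from scratch,
// since a video track source cannot be paused and resumed.
_screenSource = ExternalVideoTrackSource.CreateFromArgb32Callback(WebRTCFrameCallback);
_screenTrack = LocalVideoTrack.CreateFromSource(_screenSource,
    new LocalVideoTrackInitConfig { trackName = "screen_share" });
_videoTransceiver.LocalVideoTrack = _screenTrack;
```

If your callback delivers I420A frames instead of ARGB32, use `CreateFromI420ACallback` with the matching delegate signature.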
@KarthikRichie
Hey,
would it be possible for you to give me a little information on how you implemented your screen capture? I am currently facing exactly this problem. I'm using `GraphicsCapturePicker` for screen capturing, and what I get is a `Direct3D11CaptureFrame`. I don't know how to make the connection between my `Direct3D11CaptureFrame` and those `DeviceVideoTrackSource` and `I420AVideoFrame`/`Argb32VideoFrame` types. But I found your post and see that you used `ExternalVideoTrackSource` for that. I also created a post about that.
@arsi0001, I've added the solution here.
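Since the linked solution isn't reproduced in this thread, here is only a rough sketch of the usual approach: copy the `Direct3D11CaptureFrame`'s GPU texture to a CPU-readable staging texture, map it, and hand the BGRA bytes to `FrameRequest.CompleteRequest`. It assumes SharpDX for the D3D11 calls; `GetSharpDXTexture` is a hypothetical helper (capture samples typically implement it via `IDirect3DDxgiInterfaceAccess`), and `_d3dDevice`/`frame` are assumed fields:

```csharp
// Argb32 frame callback for the ExternalVideoTrackSource.
// 'frame' is assumed to hold the latest Direct3D11CaptureFrame.
void WebRTCFrameCallback(in FrameRequest frameRequest)
{
    // Unwrap the WinRT IDirect3DSurface to a SharpDX texture (hypothetical helper).
    using var gpuTex = GetSharpDXTexture(frame.Surface);

    // Create a staging copy with CPU read access (cache this in real code).
    var desc = gpuTex.Description;
    desc.Usage = SharpDX.Direct3D11.ResourceUsage.Staging;
    desc.CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.Read;
    desc.BindFlags = SharpDX.Direct3D11.BindFlags.None;
    desc.OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None;
    using var staging = new SharpDX.Direct3D11.Texture2D(_d3dDevice, desc);

    // Copy GPU -> staging, then map it so the CPU can read the pixels.
    _d3dDevice.ImmediateContext.CopyResource(gpuTex, staging);
    var map = _d3dDevice.ImmediateContext.MapSubresource(
        staging, 0, SharpDX.Direct3D11.MapMode.Read,
        SharpDX.Direct3D11.MapFlags.None);
    try
    {
        // Capture frames are BGRA8, which matches Argb32VideoFrame's layout.
        var argbFrame = new Argb32VideoFrame
        {
            data = map.DataPointer,
            stride = map.RowPitch,
            width = (uint)desc.Width,
            height = (uint)desc.Height,
        };
        frameRequest.CompleteRequest(argbFrame);
    }
    finally
    {
        _d3dDevice.ImmediateContext.UnmapSubresource(staging, 0);
    }
}
```

Treat this as a starting point only: real code should reuse the staging texture across frames and handle size changes when the captured window is resized.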
Describe the bug
I'm using the following code in MixedReality-WebRTC 2.0.2 to share my screen (UWP app) to a remote user (on the web) using `ExternalVideoTrackSource`.

To stop my screen share, I'm setting the `LocalVideoTrack` attached to the `videoTransceiver` to `null`.
The above code is stopping the screen capture in my end as expected.
But the `WebRTCFrameCallback(in FrameRequest frameRequest)` callback is still getting called, and I have no frames to send because the screen share is stopped. Is there a way to let the remote end know that frames will stop being sent? At the remote end, the captured screen is frozen; ideally, the expectation is to clear the screen on the remote side. I checked the behavior from Android/iOS apps (that use WebRTC) against the same client the remote user is using, and it works as expected (i.e., the captured screen at the remote end is cleared on stopping screen capture). Only for screens shared from UWP using MR WebRTC 2.0.2 am I facing this issue. Any thoughts? Am I missing something?