martingra opened this issue 3 years ago
Solved by changing the WebRtcSession class:

using System;
using WebSocketSharp;
using WebSocketSharp.Server;
using Microsoft.MixedReality.WebRTC;

public class WebRtcSession : WebSocketBehavior
{
    public PeerConnection pc { get; private set; }

    public event Action<WebRtcSession, string> MessageReceived;

    public WebRtcSession()
    {
        pc = new PeerConnection();

        // Raised when the remote peer adds a video track to the connection.
        pc.VideoTrackAdded += (RemoteVideoTrack track) =>
        {
            // Raised for every decoded remote frame, already converted to ARGB32.
            track.Argb32VideoFrameReady += (Argb32VideoFrame frame) =>
            {
                var width = frame.width;
                var height = frame.height;
                var stride = frame.stride;
                var data = frame.data;

                // Wraps the native frame buffer; that buffer is only valid for the
                // duration of this callback, so the pixels must be copied before
                // the bitmap is kept or displayed.
                System.Drawing.Bitmap bmpImage = new System.Drawing.Bitmap(
                    (int)width, (int)height, (int)stride,
                    System.Drawing.Imaging.PixelFormat.Format32bppArgb, data);
            };
        };
    }

    protected override void OnMessage(MessageEventArgs e)
    {
        MessageReceived?.Invoke(this, e.Data);
    }
}
I think it would be helpful to include this updated working example for MixedReality-WebRTC version 2.0.2.
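For reference, a minimal, untested sketch of how this behavior could be hosted with websocket-sharp. The port, the path, the SignalingHost helper name, and the wiring of MessageReceived are assumptions on my side (using the AddWebSocketService overload that takes an initializer callback), not part of the original sample:

using System;
using WebSocketSharp.Server;

static class SignalingHost
{
    // Starts a local WebSocket endpoint hosting WebRtcSession
    // (the port and path are arbitrary choices for this sketch).
    public static WebSocketServer Start()
    {
        var server = new WebSocketServer("ws://localhost:8081");

        // The initializer hooks the signaling event on each new session, so that
        // incoming messages (SDP offer/answer, ICE candidates) can be forwarded
        // to session.pc (e.g. via SetRemoteDescriptionAsync / AddIceCandidate).
        server.AddWebSocketService<WebRtcSession>("/", session =>
        {
            session.MessageReceived += (s, msg) =>
            {
                // Parse msg and apply it to s.pc here.
            };
        });

        server.Start();
        return server;
    }
}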
Hello, I tried your changes, but I don't understand how your bitmap is copied into the PictureBox. Do you have any other changes?
Sorry for the delay, I was on vacation. I didn't test rendering to the PictureBox, sorry. I was just trying to get the bitmaps.
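In case it is useful, here is a minimal, untested sketch of one way to copy each frame into a managed Bitmap and display it in a PictureBox. The handler and control names (OnRemoteFrameReady, pictureBox1, MainForm) are illustrative, not from the original sample; the explicit copy is needed because the native frame buffer is only valid during the callback:

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using System.Windows.Forms;
using Microsoft.MixedReality.WebRTC;

public partial class MainForm : Form
{
    // Subscribe in the VideoTrackAdded handler, e.g.:
    //   track.Argb32VideoFrameReady += OnRemoteFrameReady;
    private void OnRemoteFrameReady(Argb32VideoFrame frame)
    {
        int width = (int)frame.width;
        int height = (int)frame.height;
        int srcStride = (int)frame.stride;

        // Copy the native ARGB32 buffer into a managed Bitmap row by row,
        // honoring the (possibly different) source and destination strides.
        var bmp = new Bitmap(width, height, PixelFormat.Format32bppArgb);
        var bmpData = bmp.LockBits(new Rectangle(0, 0, width, height),
            ImageLockMode.WriteOnly, PixelFormat.Format32bppArgb);
        try
        {
            int rowBytes = width * 4;
            var row = new byte[rowBytes];
            for (int y = 0; y < height; y++)
            {
                IntPtr src = IntPtr.Add(frame.data, y * srcStride);
                IntPtr dst = IntPtr.Add(bmpData.Scan0, y * bmpData.Stride);
                Marshal.Copy(src, row, 0, rowBytes);
                Marshal.Copy(row, 0, dst, rowBytes);
            }
        }
        finally
        {
            bmp.UnlockBits(bmpData);
        }

        // The callback runs on a worker thread; marshal the update to the UI thread
        // and dispose the previously displayed bitmap to avoid leaking GDI handles.
        pictureBox1.BeginInvoke((Action)(() =>
        {
            var old = pictureBox1.Image;
            pictureBox1.Image = bmp;
            old?.Dispose();
        }));
    }
}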
Ok, thanks for the answer.
I am trying to run the TestReceiveAV example, using version 2.0.2 of MixedReality-WebRTC from NuGet.
I changed the code as follows (the original code is commented out):
I am getting an empty form, and
session.pc.VideoTrackAdded
is never raised. Is this the correct way to capture the received frames from the web client?
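One thing worth checking: in 2.0 the PeerConnection has to be initialized before any signaling is handled, and VideoTrackAdded only fires after a remote offer containing a video track has been negotiated. A minimal sketch of the initialization step, assuming the standard 2.0 API; the PeerConnectionSetup helper name and the public STUN server are just examples:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

static class PeerConnectionSetup
{
    // Initialize the underlying PeerConnection once per session, before any
    // SDP/ICE messages are applied; without this, no track events are raised.
    public static Task InitializeAsync(WebRtcSession session)
    {
        var config = new PeerConnectionConfiguration
        {
            IceServers = new List<IceServer>
            {
                // Example public STUN server; replace with your own ICE servers.
                new IceServer { Urls = { "stun:stun.l.google.com:19302" } }
            }
        };
        return session.pc.InitializeAsync(config);
    }
}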