microsoftgraph / microsoft-graph-comms-samples

Microsoft Graph Communications Samples

Video Playback through main video or screen share. #762

Open ebade opened 2 weeks ago

ebade commented 2 weeks ago

**Describe the issue** I am attempting to record the screen share from a participant; after stopping the recording, I want to play back what was recorded through the bot into the meeting.

**Code Snippet**

In the bot initialization:

```csharp
AudioVideoFramePlayerSettings settings = new AudioVideoFramePlayerSettings(new AudioSettings(20), new VideoSettings(), 1000);
framePlayer = new AudioVideoFramePlayer((AudioSocket)this.audioSocket, (VideoSocket)this.mainVideoSocket, settings);
```

Piping the buffer from the screen share into a list of buffers:

```csharp
List<VideoMediaBuffer> screenShareBufferList = new List<VideoMediaBuffer>();

public void recordScreenShare(VideoMediaBuffer buffer)
{
    if (shouldRecordScreenShare)
    {
        //var newVideo = new VideoSendBuffer(buffer.Data, buffer.Length, VideoFormat.NV12_1920x1080_15Fps, buffer.Timestamp);
        screenShareBufferList.Add(buffer);
    }
}
```
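For reference, a minimal sketch of the deep-copy approach that the rest of this thread converges on: copy the raw bytes out of `buffer.Data` inside the callback rather than storing the `VideoMediaBuffer` itself, on the assumption that the unmanaged buffer is only valid while the callback runs. `FrameStore`, `CopyFrame`, and `recordedFrames` are hypothetical names, not SDK types.

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

public class FrameStore
{
    // Hypothetical holder for raw NV12 frames copied out of the receive callback.
    private readonly List<byte[]> recordedFrames = new List<byte[]>();

    // Copy the unmanaged frame bytes into managed memory while the callback
    // still owns the source buffer, so the stored frame no longer depends on it.
    public void CopyFrame(IntPtr data, long length)
    {
        var managedCopy = new byte[length];
        Marshal.Copy(data, managedCopy, 0, (int)length);
        recordedFrames.Add(managedCopy);
    }
}
```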

Playing the recorded video (mostly taken from another example in an attempt to make it work):

```csharp
private const double TicksInOneMs = 10000.0;
private const double MsInOneSec = 1000.0;

public async Task PlayScreenRecord()
{
    var videoMediaBuffers = new List<VideoMediaBuffer>();
    var referenceTime = DateTime.Now.Ticks;

    var packetSizeInMs = (long)((MsInOneSec / VideoFormat.NV12_1920x1080_15Fps.FrameRate) * TicksInOneMs);

    foreach (var buffer in screenShareBufferList)
    {
        /*nint data = buffer.Data;
        IntPtr unManageBuffer = Marshal.AllocHGlobal(data);*/
        var videoBuffer = new VideoSendBuffer(buffer.Data, buffer.Length, VideoFormat.NV12_1920x1080_15Fps, referenceTime);
        videoMediaBuffers.Add(videoBuffer);
        referenceTime += packetSizeInMs;
    }

    await SendFrameVideo(videoMediaBuffers);
    //await SendFrameVideo(screenShareBufferList);
}
```
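A small aside on the pacing math above, assuming the frame rate of `NV12_1920x1080_15Fps` is 15: because of the multiplication by `TicksInOneMs`, the computed interval is in 100-ns ticks rather than milliseconds, which matches the `DateTime.Now.Ticks` reference time being advanced.

```csharp
// Pacing check, assuming VideoFormat.NV12_1920x1080_15Fps.FrameRate == 15:
// (1000 ms / 15 fps) * 10,000 ticks/ms ≈ 666,666 ticks, i.e. about 66.7 ms per frame.
// So, despite its name, packetSizeInMs holds a value in ticks, not milliseconds.
long frameIntervalTicks = (long)((1000.0 / 15.0) * 10000.0);  // 666,666
```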

The frame player call; I am not sending audio right now, but I don't think that should be a factor:

```csharp
public async Task SendFrameVideo(List<VideoMediaBuffer> buffer)
{
    try
    {
        await framePlayer.EnqueueBuffersAsync(null, buffer);
    }
    catch (Exception ex)
    {
        this.logger.Error(ex, $"[OnVideoReceived] Exception while calling SendFrameVideo");
    }
}
```
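One thing that might be worth ruling out, though it is only a guess and not verified against the media SDK: `EnqueueBuffersAsync` may expect a non-null audio list even when no audio is being sent. Passing an empty list instead of `null` is a cheap experiment:

```csharp
// Assumption (unverified): the player may want a non-null, possibly empty audio list.
// framePlayer and buffer are the same objects used in SendFrameVideo above.
var emptyAudio = new List<AudioMediaBuffer>();
await framePlayer.EnqueueBuffersAsync(emptyAudio, buffer);
```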

**Expected behavior** I expect the video to play over the screen share. Currently, when I play it, my local debugging session terminates and no errors appear in the try/catch block.

Any help would be appreciated. I am not handling this very differently from how the audio is handled, and I can play audio over the meeting. If I attempt to use the VBSS socket, I lose all sharing with the bot and still have the same issue of the app crashing.
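A debugging aside, not from the original post: when the process terminates without the catch block firing, the failure is usually happening off the calling thread or inside native media code, which a local try/catch cannot see. Hooking the standard .NET last-chance events at bot startup at least shows whether a managed exception is escaping; a pure native access violation would still require a crash dump. `logger` below stands in for the bot's existing logger.

```csharp
// Sketch: last-chance logging of managed exceptions before the process exits.
// These are standard .NET events; they will not catch a purely native crash,
// but they confirm whether a managed exception is escaping the media callbacks.
AppDomain.CurrentDomain.UnhandledException += (sender, e) =>
    logger.Error(e.ExceptionObject as Exception, "Unhandled exception, process is terminating");

TaskScheduler.UnobservedTaskException += (sender, e) =>
{
    logger.Error(e.Exception, "Unobserved task exception");
    e.SetObserved();
};
```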

InDieTasten commented 2 weeks ago

Not sure I fully understand your code, but it looks like you are not saving the buffer properly. You only keep references to buffers which are then dropped. Hard to tell though.

ebade commented 2 weeks ago

> Not sure I fully understand your code, but it looks like you are not saving the buffer properly. You only keep references to buffers which are then dropped. Hard to tell though.

I am using the variable `screenShareBufferList` to keep the buffers while in 'recording' mode. When I attempt to send the video, I recreate the video buffers to strip away the original format, because I originally got errors when sending them over the video pipeline.

ebade commented 1 week ago

For the most part I am now able to get a single frame to show up on the media pipeline. But with the working code, the data is not being stored; it seems to just grab the latest copy of the image and show it. When I debug, the same pointer appears to be referenced by each of my list items. I am not very familiar with how video frames are handled, but each appears to be a series of bytes, and when I try to copy those bytes they seem to get overwritten. The code is as follows:

This is meant to make a new copy of the buffer and load it into a new VideoSendBuffer object; in theory it should be an independent copy of the frame:

```csharp
byte[] newBuf = new byte[buffer.Length];
Marshal.Copy(buffer.Data, newBuf, 0, (int)buffer.Length);

var videoBuffer = new VideoSendBuffer(newBuf, (uint)buffer.Length, VideoFormat.NV12_1920x1080_15Fps);
screenShareBufferList.Add(videoBuffer);
```

Then I send these objects to the frame player, but all I see is the latest image from the screen share. When I debug the list of objects, they all seem to point to the same pointer, which makes no sense since I created new objects with what I thought was a new copy of the byte array.
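One way to narrow this down, offered as a suggestion rather than something from the thread: right after the `Marshal.Copy`, log each copied array's length and a cheap checksum. For NV12 at 1920x1080, a full frame is 1920 * 1080 * 3 / 2 = 3,110,400 bytes; if the lengths come out much smaller than that, or every checksum is identical, the copies are not capturing distinct full frames. A sketch that reuses `newBuf` from the snippet above:

```csharp
// Diagnostic sketch: confirm each copied frame is a distinct, full-size NV12 buffer.
// 1920x1080 NV12 is 12 bits per pixel: 1920 * 1080 * 3 / 2 = 3,110,400 bytes.
const int ExpectedNv12Length = 1920 * 1080 * 3 / 2;

long checksum = 0;
for (int i = 0; i < newBuf.Length; i += 4096)
{
    checksum += newBuf[i];  // cheap sampled checksum, enough to spot identical frames
}

// Replace Console.WriteLine with the bot's logger if preferred.
Console.WriteLine($"Copied frame: length={newBuf.Length} (expected {ExpectedNv12Length}), checksum={checksum}");
```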

tedlatte commented 1 week ago

Hey ebade, if you're working on recording and playing back screen share video buffers through a bot, you might want to check out the Recall.ai API.

It’s a simple 3rd party API that lets you use meeting bots to get raw audio/video/metadata from meetings without you needing to spend months to build, scale and maintain these bots.

Here are the API docs: https://docs.recall.ai