Closed — NandishNR closed this issue 1 year ago
It depends on which method you use to render video, but you can use a standard Win32 child control, like the Static control I used in MainWindow.xaml.cs to render a webcam preview with DirectShow (_m_hWndContainer = CreateWindowEx(WS_EX_LAYERED, "Static", ...)).
The requirement is to render videos from multiple users in a grid layout (2x2, 3x3, etc.). The above solution works only for a single user's video via a window handle, and even then it displays the entire window. Also, the layout is designed in a UserControl/Page, and a Window can't be placed over it.
Earlier, the layout was designed with the code below, and multiple instances of it were created to render each user's video.
<WindowsFormsHost x:Name="webCamHost" HorizontalAlignment="Stretch" VerticalAlignment="Stretch">
<winForms:Panel x:Name="WebCamPreview" Dock="Fill"/>
</WindowsFormsHost>
You mean that you have a grid with several videos inside? But which method is used to display the webcam previews? (There is DirectShow, Microsoft Media Foundation, MediaCapture, ...)
The WinForms Panel is defined as shown above, and we dynamically create multiple views based on the selected layout (grid/active speaker).
Sample rendering
Finally, we give the panel's handle to the media SDK to render video onto each view. SDK reference below.
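For reference, the handle hand-off described above can be sketched like this. This is a minimal sketch: mediaSdk and SetVideoWindow are hypothetical names standing in for whatever HWND-based render call the actual SDK exposes.

```csharp
// Sketch of the WPF-era pattern: one WinForms Panel per grid cell,
// with each panel's HWND handed to the media SDK.
// "mediaSdk.SetVideoWindow" is a hypothetical stand-in for the real
// SDK's handle-based render call.
private void AttachVideo(int userId)
{
    // Panel.Handle lazily creates the underlying Win32 window
    // and returns its HWND.
    IntPtr hwnd = WebCamPreview.Handle;

    // The SDK then draws this user's video directly into that
    // child window; XAML never sees the pixels.
    mediaSdk.SetVideoWindow(userId, hwnd);
}
```

This is exactly the pattern that breaks in WinUI 3, because (as discussed below) WinUI 3 controls do not expose per-control HWNDs.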
I have actually been having a tough time writing a reply to this. There is a lot going on here, and I honestly hate not giving all the related information, but I'm going to try to keep it simple for now.
WinUI 3 uses the same kind of architecture as UWP XAML. This means that besides the top-level window, there are no HWNDs; all the content is drawn by the Windows composition engine. So unless that library offers a way to target a DXGI swap chain or the new composition swap chain, or allows access through the UWP media player, I seriously doubt you will be able to use it with WinUI 3.
Microsoft has been moving towards making things more Direct3D based, so I expect this to get worse as time goes on.
@DarranRowe - so the conclusion is that there is no way to access a control's handle in WinUI 3? As @castorix replied, only the Window provides a handle?
Yes, WinUI 3 controls use the composition engine's Visual here. There is no HWND available for them.
I don't know which method you use, but if you have access to the video as an array of bytes, you could use SwapChainPanels.
@castorix - we have an array of bytes in YUV420 video format; please let us know which control suits us. We just want to bind the stream as the source, as we need to avoid processing time so that there is no lag in the video. Something like https://learn.microsoft.com/en-us/uwp/api/windows.ui.xaml.controls.mediaelement.setsource?view=winrt-22621
As I said, with a SwapChainPanel, using Direct2D for the webcam images (ID2D1Bitmap::CopyFromMemory to get a D2D1 bitmap from the bytes array; it is fast, as it uses the GPU).
I quickly tested with IMFCaptureEngine and it works, once I could get a bytes array (with IMFMediaBuffer).
(With the Direct2D.cs I posted recently there was a bug: a ref was missing in
void CopyFromMemory(ref D2D1_RECT_U dstRect, IntPtr srcData, uint pitch);)
Also, from an array of bytes, a simple Image control can be used with a WriteableBitmap as its source (WriteableBitmap.PixelBuffer.AsStream().WriteAsync to fill it from the bytes array).
@castorix - Sorry, I'm not following your reply completely. We need to display video (ours is a video conferencing product). We continuously receive video at 10-15 fps, and we need to simply play that video without much processing.
Please suggest a solution for converting the bytes to a stream, and which control can be used.
You said you receive a bytes array. I posted 2 controls which can display bytes arrays from a webcam: SwapChainPanel, or Image (with a WriteableBitmap to convert the bytes). I tested 2 SwapChainPanels with a webcam (I only have a virtual webcam to test), and it works like with 1 SwapChainPanel, although I sometimes get horizontal black lines (I probably left a bug somewhere...):
I made more tests, and it is a lot simpler with an Image control.
From a bytes array that I got from the webcam (the pBytesArray variable; _mw is a MainWindow with an img1 Image control; m_nWidth and m_nHeight are the default webcam image dimensions, 640x480 for my virtual webcam), I can do:
WriteableBitmap m_WriteableBitmapCapture = null;
// To avoid RPC_E_WRONG_THREAD, update the UI on the dispatcher queue
bool isQueued = _mw.DispatcherQueue.TryEnqueue(Microsoft.UI.Dispatching.DispatcherQueuePriority.Normal, async () =>
{
    if (m_WriteableBitmapCapture != null)
        m_WriteableBitmapCapture.Invalidate();
    m_WriteableBitmapCapture = new WriteableBitmap((int)m_nWidth, (int)m_nHeight);
    await m_WriteableBitmapCapture.PixelBuffer.AsStream().WriteAsync(pBytesArray, 0, pBytesArray.Length);
    _mw.img1.Source = m_WriteableBitmapCapture;
});
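If allocating a new WriteableBitmap on every frame turns out to be costly at 10-15 fps, a variant of the snippet above can reuse one bitmap and just rewrite its pixel buffer each frame. This is a sketch under the same assumed names (_mw, img1, m_nWidth, m_nHeight, pBytesArray) and assumes the frames arrive as BGRA8 bytes; OnFrameReceived is a hypothetical callback name.

```csharp
// Requires: using System.Runtime.InteropServices.WindowsRuntime; (for AsStream)
WriteableBitmap m_WriteableBitmapCapture = null;

void OnFrameReceived(byte[] pBytesArray)
{
    // Marshal to the UI thread to avoid RPC_E_WRONG_THREAD
    _mw.DispatcherQueue.TryEnqueue(Microsoft.UI.Dispatching.DispatcherQueuePriority.Normal, () =>
    {
        if (m_WriteableBitmapCapture == null)
        {
            // Allocate once and set the Image source once
            m_WriteableBitmapCapture = new WriteableBitmap((int)m_nWidth, (int)m_nHeight);
            _mw.img1.Source = m_WriteableBitmapCapture;
        }
        // Overwrite the existing pixel buffer with the new frame
        using (var stream = m_WriteableBitmapCapture.PixelBuffer.AsStream())
        {
            stream.Write(pBytesArray, 0, pBytesArray.Length);
        }
        m_WriteableBitmapCapture.Invalidate();   // ask XAML to redraw the bitmap
    });
}
```

This avoids one GC allocation per frame; the trade-off is that all frames must have the same dimensions, since the bitmap is sized once.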
Thanks for the sample code. @castorix - how do we render video instead of an image?
how do we render video instead of an image?

I receive the bytes array in an event (tested with IMFCaptureEngineOnSampleCallback::OnSample), so it displays video (a succession of images) in the Image control.
What is the format of the video you are using? Is it possible to get a similar result for YUV data, since from my understanding Windows cannot directly convert it? Basically, if I receive video frame data in YUV format, how can we convert it to a WriteableBitmap and display it in the Image control while preserving the colors? From what I know, WriteableBitmap accepts only RGB, CMYK and B&W formats.
You can convert YUV to RGB (or another format) with the formulas from MSDN/Google, e.g. "Recommended 8-Bit YUV Formats for Video Rendering", yuv2rgb.c, or "How to use YUV (YUV_420_888) Image in Android".
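For illustration, here is a minimal CPU-side sketch of an I420 (planar YUV420) to BGRA8 conversion, using the BT.601 integer formulas given in the "Recommended 8-Bit YUV Formats for Video Rendering" article. For real-time feeds a GPU path (pixel shader, or a Media Foundation/Direct2D color converter) would be preferable; this loop just shows the math.

```csharp
// Converts one I420 frame (Y plane, then U plane, then V plane, with
// U/V subsampled 2x2) into a BGRA8 buffer suitable for WriteableBitmap.
// BT.601 studio-swing formulas: C = Y-16, D = U-128, E = V-128.
static byte[] I420ToBgra(byte[] yuv, int width, int height)
{
    var bgra = new byte[width * height * 4];
    int ySize = width * height;
    int uOffset = ySize;               // U plane follows Y plane
    int vOffset = ySize + ySize / 4;   // V plane follows U plane
    int chromaStride = width / 2;

    for (int row = 0; row < height; row++)
    {
        for (int col = 0; col < width; col++)
        {
            int chroma = (row / 2) * chromaStride + (col / 2);
            int c = yuv[row * width + col] - 16;
            int d = yuv[uOffset + chroma] - 128;
            int e = yuv[vOffset + chroma] - 128;

            int r = Clip((298 * c + 409 * e + 128) >> 8);
            int g = Clip((298 * c - 100 * d - 208 * e + 128) >> 8);
            int b = Clip((298 * c + 516 * d + 128) >> 8);

            int i = (row * width + col) * 4;
            bgra[i + 0] = (byte)b;   // WriteableBitmap expects BGRA byte order
            bgra[i + 1] = (byte)g;
            bgra[i + 2] = (byte)r;
            bgra[i + 3] = 255;       // opaque alpha
        }
    }
    return bgra;
}

static int Clip(int x) => x < 0 ? 0 : (x > 255 ? 255 : x);
```

The resulting byte[] can be written straight into WriteableBitmap.PixelBuffer as shown earlier in this thread.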
I am able to convert from YUV to RGB and render the image, but converting it to RGB and then to an image source (like BitmapImage/WriteableBitmap) can be slow, right? In the case of real-time video feeds, won't there be a visible delay between receiving and rendering the data? In a video conferencing product, we would like to reduce this delay as much as possible, since there will be multiple users with video feeds at the same time. Is there a better approach for this?
In my tests it is fast, as I used MF_CAPTURE_ENGINE_D3D_MANAGER, so it uses the GPU. But I don't know about your method, or whether it can use DirectX surfaces for hardware acceleration...
This issue is stale because it has been open 180 days with no activity. Remove stale label or comment or this will be closed in 5 days.
In a WPF project, we hosted a WinForms Panel using WindowsFormsHost and rendered live video using the panel's handle, with the code below; the rest is handled in the *.cs file.
<WindowsFormsHost x:Name="webCamHost" HorizontalAlignment="Stretch" VerticalAlignment="Stretch">
<winForms:Panel x:Name="WebCamPreview" Dock="Fill"/>
</WindowsFormsHost>
Hosting WinForms controls is not possible in WinUI 3, so we want to know which WinUI 3 control provides a handle that can be used to render live video.