Closed PatrickSCLin closed 8 years ago
Win2D does not directly support YUV data formats. If you are willing to interop and include some native C++ code, you have several options:
My application is a Windows 10 universal app. Is option 2 the easier and better option for me?
Option 2 is probably the easiest, but these are all somewhat equivalent in that they require stepping outside of Win2D and writing some C++ D2D code to use alongside it.
Which D3D11_BIND_FLAG should I use for this purpose?
I haven't actually tried this, but I'd expect these need to be shader resources (the same as using a texture with D3D).
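A sketch of what that might look like for an 8-bit luma plane (untested; frameWidth and frameHeight are placeholders for your frame size):

```cpp
// Sketch (untested): describing an R8 texture that D2D can sample,
// e.g. the Y plane. D3D11_BIND_SHADER_RESOURCE is the key flag; the
// chroma texture(s) would use the same flag.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = frameWidth;   // placeholder: your frame width
desc.Height           = frameHeight;  // placeholder: your frame height
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_R8_UNORM;
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;
```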
How do I interop an ID2D1ImageSource to a CanvasVirtualBitmap?
I have already created the ID2D1ImageSource, but I have no idea how to do the last step.
I just committed the code on GitHub: https://github.com/PatrickSCLin/YUVSample
Interop is documented here: http://microsoft.github.io/Win2D/html/Interop.htm
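Following that page, the wrapping step might look something like this in C++/CX (a sketch, untested; it assumes Win2D's Microsoft.Graphics.Canvas.native.h interop helpers, and that canvasDevice and d2dImageSource already exist and share the same underlying D2D device):

```cpp
#include <Microsoft.Graphics.Canvas.native.h>

using namespace Microsoft::Graphics::Canvas;

// Wrap an existing ID2D1ImageSource in a Win2D CanvasVirtualBitmap.
// 'd2dImageSource' is assumed to be a ComPtr<ID2D1ImageSource>;
// GetOrCreate returns an existing wrapper if one was already created.
CanvasVirtualBitmap^ bitmap =
    GetOrCreate<CanvasVirtualBitmap>(canvasDevice, d2dImageSource.Get());
```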
I just finished the interop, but the image that gets drawn is incorrect.
I guess the UV is wrong. I'm confused, since I have to interop between Win2D & Direct2D:
should I use the CanvasDrawingSession, send it into C++, and interop it to an ID2D1DeviceContext2 to call CreateImageSourceFromDxgi,
or should I use the ID2D1Device which I created in C++, send it to C#, and interop it to a CanvasDevice?
This sample is what I've done so far: https://github.com/PatrickSCLin/YUVSample
Maybe you can give me some hints from the code.
I can draw YUV frames in Win2D now, but...
is option 2 based only on the NV12 format?
I mean, most H.264 files are decoded to yuv420p frames by FFmpeg,
but the position of the UV data is a bit different from NV12. Can I use the yuv420p format directly in option 2?
Thanks for the help :)
I just had to write a little function to swap the UV myself, and it displays perfectly.
Great to hear you got this working! How is the performance?
The incoming surfaces for CreateImageSourceFromDxgi don't have to be NV12. https://msdn.microsoft.com/en-us/library/windows/desktop/dn890791%28v=vs.85%29.aspx has a table documenting the various options - input can either be a single planar format texture, or a pair of textures containing separate Y and UV data.
Sorry, I'm trying to put yuv420p frame data directly into a CanvasVirtualBitmap via CreateImageSourceFromDxgi.
Since yuv420p is a planar format, in this case I should create 3 textures for Y, U, and V, all of them in R8 format, right?
Also, I saw the documentation says:
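For 4:2:0 subsampling the chroma planes are half size in each dimension, so the three R8 textures' dimensions would work out like this (a sketch; the helper name is my own):

```cpp
// Plane dimensions for a yuv420p frame: the Y plane is full size, and the
// U and V planes are each subsampled by 2 horizontally and vertically.
// All three can be uploaded as DXGI_FORMAT_R8_UNORM textures.
struct PlaneSizes
{
    int lumaWidth, lumaHeight;     // Y texture
    int chromaWidth, chromaHeight; // U and V textures (one each)
};

PlaneSizes Yuv420PlaneSizes(int width, int height)
{
    // Round up so odd frame sizes still cover every pixel.
    return { width, height, (width + 1) / 2, (height + 1) / 2 };
}
```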
HRESULT CreateImageSourceFromDxgi(
[in] IDXGISurface **surfaces,
[in, optional] const D2D1_RECT_U *sourceRectangles,
UINT32 surfaceCount,
DXGI_COLOR_SPACE_TYPE colorSpace,
D2D1_IMAGE_SOURCE_FROM_DXGI_OPTIONS options,
[out] ID2D1ImageSource **imageSource
);
sourceRectangles [in, optional]
Type: const D2D1_RECT_U*
The regions of the surfaces to create the image source from.
The problem is, I cannot see the sourceRectangles argument on my ID2D1DeviceContext2 interface.
The function in my Visual Studio 2015 Update 1 looks like this, with no other overload:
HRESULT CreateImageSourceFromDxgi(
[in] IDXGISurface **surfaces,
UINT32 surfaceCount,
DXGI_COLOR_SPACE_TYPE colorSpace,
D2D1_IMAGE_SOURCE_FROM_DXGI_OPTIONS options,
[out] ID2D1ImageSource **imageSource
);
What should I do?
Option 3 only works when the UV is in packed format, right?
I'm using option 2 for my project now. If the decoder runs well,
it should be fine to draw at 30 fps, 4000 x 3000, on an i5-4460 CPU.
Thanks for the help :)
I'm developing a video streaming app with Win2D and FFmpeg.
Because Win2D only works with D2D pixel formats, I have to call sws_scale,
and in a worst case, like a big 4000 x 3000 H.264 stream,
it's really too slow to convert from YUV to BGRA in FFmpeg.
I just noticed a test called "Direct3DSurfaceInteropTests".
I don't know the details, but is it possible to bind YUV data to a D3D surface and convert YUV to BGRA on the GPU, or even better, can Win2D draw the surface directly?
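For context, the YUV-to-BGRA conversion itself is just per-pixel arithmetic, which is exactly why it is so much cheaper on the GPU, where it runs massively in parallel. A sketch of the common BT.601 limited-range integer approximation (not FFmpeg's exact code):

```cpp
#include <algorithm>
#include <cstdint>

// Convert one BT.601 limited-range YUV pixel to BGRA channels, using the
// widely used fixed-point approximation. This is the same math that
// sws_scale performs per pixel on the CPU.
void YuvToBgra(uint8_t y, uint8_t u, uint8_t v,
               uint8_t* b, uint8_t* g, uint8_t* r)
{
    int c = static_cast<int>(y) - 16;   // luma, offset for limited range
    int d = static_cast<int>(u) - 128;  // blue-difference chroma
    int e = static_cast<int>(v) - 128;  // red-difference chroma
    auto clamp = [](int x) {
        return static_cast<uint8_t>(std::min(255, std::max(0, x)));
    };
    *r = clamp((298 * c + 409 * e + 128) >> 8);
    *g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp((298 * c + 516 * d + 128) >> 8);
}
```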