Closed RookieDeveloper-Wang closed 3 years ago
Hi @RookieDeveloper-Wang, I have never worked with Unity or Hololens, so I cannot help much. It is certainly possible to integrate Unity with a XAML/UWP UI. Check these links:
https://github.com/microsoft/Unity-Xaml-Sample https://docs.unity3d.com/Manual/UnityasaLibrary-Windows.html
You can check our samples to see how to get FFmpegInteropMSS running in a XAML UI. So once you have a XAML UI running in your Unity app, it should be easy to add FFmpegInterop into it.
But I think that this approach only allows you to overlay a XAML UI over the Unity app. If you want to be more flexible and put the video image onto your 3D model, it should be possible to render the video frames directly into Unity. You can configure Unity to use D3D11 as its rendering engine; then it should be possible to use MediaPlayer in frame server mode, use our FFmpegInteropMSS with it, and render the resulting D3D11 texture from MediaPlayer directly in your actual app scene. It might be complicated to get this working, and I cannot really help you with this, but as I see it, it should be possible.
Do you mean I need a server to do the transfer? Or can FFmpegInteropX be directly integrated into Unity?
Thank you anyway~
FFmpegInteropX can be used with the MediaPlayer class for playback. There are two ways to use MediaPlayer:

Normally, you put it into a XAML MediaPlayerElement in a XAML UI. You can use this in Unity as well, if you follow the XAML overlay approach. But AFAIK, this only allows you to overlay a "flat" XAML UI over your Unity window.

The other way is the so-called frame server mode of MediaPlayer. In that mode, MediaPlayer raises an event whenever a new frame must be drawn. The event carries a D3D11 texture of the frame, which you can draw wherever you like using D3D11. A XAML UI is not needed in that mode. If you switch Unity to use D3D11, I think it should be possible to draw that texture directly onto your Unity 3D objects, in 3D space. You do not need a separate server for frame server mode; it is just called that.
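To make the frame server approach more concrete, here is a minimal C# sketch of the UWP side. It assumes the `FFmpegInteropMSS.CreateFromUriAsync` / `GetMediaStreamSource` API shape shown in the project's samples, and a hypothetical `targetSurface` that you would create from the D3D11 texture your Unity material uses; it is untested and only illustrates the wiring, not a drop-in implementation.

```csharp
using Windows.Graphics.DirectX.Direct3D11;
using Windows.Media.Core;
using Windows.Media.Playback;

// Open the stream with FFmpegInteropMSS (API per the project's samples).
var ffmpegMss = await FFmpegInteropMSS.CreateFromUriAsync("http://example.com/video.mkv");

var mediaPlayer = new MediaPlayer
{
    // Enable frame server mode: no XAML element, frames are delivered via events.
    IsVideoFrameServerEnabled = true,
    Source = MediaSource.CreateFromMediaStreamSource(ffmpegMss.GetMediaStreamSource())
};

mediaPlayer.VideoFrameAvailable += (MediaPlayer sender, object args) =>
{
    // 'targetSurface' is an IDirect3DSurface you create yourself,
    // e.g. wrapping the shared D3D11 texture that Unity renders from.
    sender.CopyFrameToVideoSurface(targetSurface);
};

mediaPlayer.Play();
```

Note that `VideoFrameAvailable` fires on a background thread, so any interaction with Unity's render pipeline has to be synchronized with Unity's own rendering (for example via a native rendering plugin callback).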
There's a sample for using frame server mode:
I personally tried it and it has pretty impressive performance, but there are lots of gotchas to handle, especially around the complicated lifecycle of a UWP app. You can render both video and subtitles on demand.
What should I do? Thank you.