limbouser opened this issue 4 years ago
So this plugin works using platform channels & platform views.
The dart code is just a small wrapper. When you call OmxplayerVideoPlayer.create, what is actually being invoked is this method in the native code.
The cool thing about omxplayer is that it has a DBus-interface. That is what flutter-pi is using to control omxplayer.
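As a rough illustration of that DBus control path (not flutter-pi's actual code, which is C): omxplayer speaks an MPRIS-style interface, and commands can be issued with dbus-send. The sketch below only builds such a command line; the bus name, object path, and method names are taken from omxplayer's bundled dbuscontrol.sh, so treat the exact values as assumptions to verify against your setup.

```python
# Hypothetical sketch: build (but don't run) a dbus-send command that asks a
# running omxplayer instance to toggle pause, via its MPRIS-style interface.
def omxplayer_dbus_cmd(method: str, *args: str) -> list:
    return [
        "dbus-send", "--print-reply", "--session",
        "--dest=org.mpris.MediaPlayer2.omxplayer",
        "/org/mpris/MediaPlayer2",
        "org.mpris.MediaPlayer2.Player." + method,
    ] + list(args)

cmd = omxplayer_dbus_cmd("PlayPause")
print(" ".join(cmd))
```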
In on_create, flutter-pi launches a new thread just for controlling omxplayer (called the "omxplayer manager thread") (Source here). That thread does these things, in order:
These tasks are scheduled by the platform message callbacks like on_pause, on_set_volume etc., which are in turn invoked by the dart code inside OmxplayerVideoPlayer.
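The scheduling pattern described above (platform message callbacks pushing tasks to a dedicated player-control thread that executes them in order) can be sketched in Python. This is a minimal model of the idea, not flutter-pi's C implementation; the callback names mirror the ones mentioned above.

```python
import queue
import threading

# Minimal model of an "omxplayer manager thread": platform message callbacks
# enqueue tasks, and a single worker thread executes them in order.
tasks = queue.Queue()
log = []

def manager_thread():
    while True:
        task = tasks.get()
        if task is None:                      # shutdown sentinel
            break
        name, payload = task
        log.append(f"{name}({payload})")      # stand-in for real player control

worker = threading.Thread(target=manager_thread)
worker.start()

# These would be invoked by the dart side via platform messages:
def on_pause():
    tasks.put(("pause", ""))

def on_set_volume(volume):
    tasks.put(("set_volume", volume))

def on_update_view(x, y, w, h):
    tasks.put(("update_view", f"{x},{y},{w},{h}"))

on_pause()
on_set_volume(0.5)
on_update_view(0, 0, 640, 480)
tasks.put(None)
worker.join()
print(log)
```

The single consumer thread is what guarantees the "in order" property: no matter which callback fires from which context, the player only ever sees one command at a time.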
When omxplayer is launched, it's launched with size 1x1 and --layer -128, so it's hopefully invisible. It is only made visible when its view (built using OmxplayerVideoPlayer.buildView) is added to the widget tree.
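A sketch of such a launch command: the --layer flag is quoted above, and --win is omxplayer's flag for placing the video window, but the exact argument spelling and order here are illustrative assumptions, not flutter-pi's literal invocation.

```python
# Hypothetical sketch: build the argv for launching omxplayer invisibly,
# as described above: a 1x1 window and a very low layer.
def omxplayer_launch_argv(video_path):
    return [
        "omxplayer",
        "--layer", "-128",    # far below everything else
        "--win", "0 0 1 1",   # 1x1 window: effectively invisible
        video_path,
    ]

argv = omxplayer_launch_argv("/home/pi/video.mp4")
print(argv)
```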
The key part of the Widget returned by buildView is a PlatformView. (See the link on top for an explanation.) When that PlatformView is rendered, the flutter engine basically tells flutter-pi "this platform view has coordinates (x, y) and size (w, h)" (and some other stuff). The rendering code then updates the view of omxplayer using these dimensions given by the flutter engine, by scheduling an "update view" task on the omxplayer manager thread.
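As a hedged sketch of what that "update view" step has to compute: omxplayer's DBus interface (per its bundled dbuscontrol.sh) takes the target window as an "x1 y1 x2 y2" corner string, while the flutter engine reports an (x, y, w, h) rectangle, so the two forms have to be converted. This shows only the arithmetic, not flutter-pi's code.

```python
# Convert the (x, y, w, h) rectangle reported by the flutter engine into the
# "x1 y1 x2 y2" corner form that omxplayer's window-position DBus call expects.
def view_rect_to_videopos(x, y, w, h):
    return f"{round(x)} {round(y)} {round(x + w)} {round(y + h)}"

pos = view_rect_to_videopos(100, 50, 640, 360)
print(pos)  # -> "100 50 740 410"
```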
There's one thing left unexplained: how it's possible that some UI is rendered below omxplayer, but also some UI above it. When flutter renders a platform view, it also splits all the graphics into 3 layers (explanation here):
Using hardware planes, flutter-pi can then render layer 1 below omxplayer and render layer 3 above omxplayer.
More precisely, each UI layer gets its own hardware plane to render into. Flutter-pi can then change the zpos of these planes (which is what omxplayer lets you specify using the --layer option), so that one of these planes is above omxplayer and one below.
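The plane arrangement above can be modelled as a simple zpos assignment. The concrete numbers in this sketch are made up; only the relative ordering matters, and the values flutter-pi actually uses may differ.

```python
# Model of the layering trick: two hardware planes for the flutter UI, with
# the omxplayer layer sandwiched in between. Only the zpos order matters.
OMXPLAYER_LAYER = 0
zpos = {
    "ui_below": OMXPLAYER_LAYER - 1,  # layer 1: widgets behind the video
    "omxplayer": OMXPLAYER_LAYER,     # layer 2: the video itself
    "ui_above": OMXPLAYER_LAYER + 1,  # layer 3: widgets in front of the video
}

# Sort surfaces bottom-to-top by zpos, as the display controller would.
stack = sorted(zpos, key=zpos.get)
print(stack)  # -> ['ui_below', 'omxplayer', 'ui_above']
```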
This is all very complicated, and I hope I was able to explain it at least somewhat. Another alternative to this was to use gstreamer to render the video into an OpenGL texture, but I did some testing and gstreamer didn't seem to support Raspberry Pi that well. The way it's implemented now (using omxplayer and hardware planes) is kinda nice too though. I haven't compared it, but at least according to this article hardware planes should be faster for video playback.
To be very frank, I am kind of new to all this stuff, but your explanation has helped me get an idea of how the communication between the player and Flutter happens. I have a player which I am using to play video; I have launched it and performed every operation on the player using Dart. Now what I need to do is make it fit properly inside a Flutter app. Right now, when I am playing a video, the playback happens under all the other layers. I couldn't properly understand the part where you explained the layers. My doubt is:
Again thank you for the great explanation you have given me. 👍
There are some online resources about linux low-level graphics, see here for example.
I think it's impossible to make some part of the app UI invisible without direct OpenGL access. If you render some invisible widget, you'll just see the widgets behind that widget instead of your video.
It's best to just write some native code to do that. That's the reason why flutter explicitly supports native code - for lower level, platform specific access. I don't see a way to do it without native code either.
The best solution would be rendering to texture, I think. That allows flutter to mutate the view however it wants, rotate by arbitrary degrees, crop it, colorfilter it, etc. The cost of that would maybe be memory bandwidth, but I'm not sure that's noticeable on the Raspberry Pi. This means your player would need to support outputting into either:
Platform Views are kinda hacky, and should be used only when necessary (like it was in my case since gstreamer wasn't stable enough)
Hey, this definitely looks great, one can play videos without any change in the app code using this. I am greatly interested to know how the dart files pass requests to omxplayer and how the video playback happens. I am greatly curious about this. Can you please give me some basic information about it?