
package:media_kit

A cross-platform video player & audio player for Flutter & Dart.



Sponsored with 💖 by

Stream Chat (Try the Flutter Chat tutorial)

Clever Apps for Film Professionals

Installation

package:media_kit is split into multiple packages to improve modularity & reduce bundle size.

For apps that need video playback:

dependencies:
  media_kit: ^1.1.10                             # Primary package.
  media_kit_video: ^1.2.4                        # For video rendering.
  media_kit_libs_video: ^1.0.4                   # Native video dependencies.

For apps that need audio playback:

dependencies:
  media_kit: ^1.1.10                             # Primary package.  
  media_kit_libs_audio: ^1.0.4                   # Native audio dependencies.


Platforms

| Platform | Video | Audio | Notes | Demo |
|----------|-------|-------|-------|------|
| Android | ✅ | ✅ | Android 5.0 or above. | Download |
| iOS | ✅ | ✅ | iOS 9 or above. | Download |
| macOS | ✅ | ✅ | macOS 10.9 or above. | Download |
| Windows | ✅ | ✅ | Windows 7 or above. | Download |
| GNU/Linux | ✅ | ✅ | Any modern GNU/Linux distribution. | Download |
| Web | ✅ | ✅ | Any modern web browser. | Visit |

TL;DR

A quick usage example.

import 'package:flutter/material.dart';

// Make sure to add following packages to pubspec.yaml:
// * media_kit
// * media_kit_video
// * media_kit_libs_video
import 'package:media_kit/media_kit.dart';                      // Provides [Player], [Media], [Playlist] etc.
import 'package:media_kit_video/media_kit_video.dart';          // Provides [VideoController] & [Video] etc.        

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  // Necessary initialization for package:media_kit.
  MediaKit.ensureInitialized();
  runApp(
    const MaterialApp(
      home: MyScreen(),
    ),
  );
}

class MyScreen extends StatefulWidget {
  const MyScreen({Key? key}) : super(key: key);
  @override
  State<MyScreen> createState() => MyScreenState();
}

class MyScreenState extends State<MyScreen> {
  // Create a [Player] to control playback.
  late final player = Player();
  // Create a [VideoController] to handle video output from [Player].
  late final controller = VideoController(player);

  @override
  void initState() {
    super.initState();
    // Play a [Media] or [Playlist].
    player.open(Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'));
  }

  @override
  void dispose() {
    player.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Center(
      child: SizedBox(
        width: MediaQuery.of(context).size.width,
        height: MediaQuery.of(context).size.width * 9.0 / 16.0,
        // Use [Video] widget to display video output.
        child: Video(controller: controller),
      ),
    );
  }
}

Note: You may need to add the required permissions to your project; see the Permissions section below.

Guide

A usage guide for package:media_kit.

Tip: Use Ctrl + F to quickly search for things.


Initialization

MediaKit.ensureInitialized must be called before using the package:

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  // Make sure to add the required packages to pubspec.yaml:
  // * https://github.com/media-kit/media-kit#installation
  // * https://pub.dev/packages/media_kit#installation
  MediaKit.ensureInitialized();
  runApp(const MyApp());
}

The method also accepts some optional arguments to customize the global behavior. To handle any initialization errors, the call may be wrapped in try/catch.
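For instance, a minimal sketch of guarding the initialization; the caught error type & handling shown here are illustrative, not part of the documented API:

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  try {
    MediaKit.ensureInitialized();
  } catch (exception) {
    // e.g. the required native libraries could not be found / loaded.
    debugPrint('package:media_kit initialization failed: $exception');
  }
  runApp(const MyApp());
}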

Create a Player

A Player instance is used to start & control the playback of a media source e.g. URL or file.

final Player player = Player();

Additional options may be provided using the configuration argument in the constructor. In most situations, you will not need this.

final Player player = Player(
  configuration: PlayerConfiguration(
    // Supply your options:
    title: 'My awesome package:media_kit application',
    ready: () {
      print('The initialization is complete.');
    },
  ),
);

Dispose a Player

It is extremely important to release the allocated resources back to the system:

await player.dispose();

Open a Media or Playlist

A Playable can either be a Media or a Playlist.

Use the Player.open method to load & start playback.

Media

final playable = Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4');
await player.open(playable);

Playlist

final playable = Playlist(
  [
    Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373709-603a7a89-2105-4e1b-a5a5-a6c3567c9a59.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373716-76da0a4e-225a-44e4-9ee7-3e9006dbc3e3.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373718-86ce5e1d-d195-45d5-baa6-ef94041d0b90.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373720-14d69157-1a56-4a78-a2f4-d7a134d7c3e9.mp4'),
  ],
);
await player.open(playable);

Notes:

  1. By default, this will automatically start playing the playable. This may be disabled as follows:
await player.open(
  playable,
  play: false,
);
  2. By default, the playlist will start at index 0. This may be changed as follows:
final playable = Playlist(
  [
    Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373709-603a7a89-2105-4e1b-a5a5-a6c3567c9a59.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373716-76da0a4e-225a-44e4-9ee7-3e9006dbc3e3.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373718-86ce5e1d-d195-45d5-baa6-ef94041d0b90.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373720-14d69157-1a56-4a78-a2f4-d7a134d7c3e9.mp4'),
  ],
  // Declare the starting position.
  index: 0,
);
await player.open(playable);

Play, pause or play/pause

The 3 methods are:

await player.play();
await player.pause();
await player.playOrPause();

Stop

The stop method may be used to stop the playback of currently opened Media or Playlist.

await player.stop();

Unlike dispose, it does not release the allocated resources back to the system; the Player remains usable.

Seek

Supply the final position to Player.seek method as Duration:

await player.seek(
  const Duration(
    minutes: 6,
    seconds: 9,
  ),
);

Loop or repeat

Three PlaylistModes are available:

await player.setPlaylistMode(PlaylistMode.none);     // No looping.
await player.setPlaylistMode(PlaylistMode.single);   // Loop the current item.
await player.setPlaylistMode(PlaylistMode.loop);     // Loop the whole playlist.

Set volume, rate or pitch

Set the volume

This controls the loudness of audio output. The maximum volume is 100.0.

await player.setVolume(50.0);

Set the rate

This controls the playback speed.

await player.setRate(1.5);

Set the pitch

This controls the pitch of the audio output.

await player.setPitch(1.2);

Note: This requires the pitch argument to be true in PlayerConfiguration.
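A hedged sketch of enabling this at construction time, assuming the pitch flag on PlayerConfiguration described in the note above:

final Player player = Player(
  configuration: const PlayerConfiguration(
    // Enables Player.setPitch (see the note above).
    pitch: true,
  ),
);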

Handle playback events

You can access or subscribe to Player's state changes.

Event handling is an extremely important part of media playback. It is used to show changes in the UI, handle errors, detect the occurrence of play/pause, end-of-file, position updates etc.

A typical example would be:

player.stream.playing.listen(
  (bool playing) {
    if (playing) {
      // Playing.
    } else {
      // Paused.
    }
  },
);
player.stream.position.listen(
  (Duration position) {
    setState(() {
      // Update UI.
    });
  },
);

The following state(s) are available as events:

| Type | Name | Description |
|------|------|-------------|
| Stream<Playlist> | playlist | Currently opened media sources. |
| Stream<bool> | playing | Whether playing or not. |
| Stream<bool> | completed | Whether the end of the currently playing media source has been reached. |
| Stream<Duration> | position | Current playback position. |
| Stream<Duration> | duration | Current playback duration. |
| Stream<double> | volume | Current volume. |
| Stream<double> | rate | Current playback rate. |
| Stream<double> | pitch | Current pitch. |
| Stream<bool> | buffering | Whether buffering or not. |
| Stream<Duration> | buffer | Current buffer position. This indicates how much of the stream has been decoded & cached by the demuxer. |
| Stream<PlaylistMode> | playlistMode | Current playlist mode. |
| Stream<AudioParams> | audioParams | Audio parameters of the currently playing media source e.g. sample rate, channels, etc. |
| Stream<VideoParams> | videoParams | Video parameters of the currently playing media source e.g. width, height, rotation etc. |
| Stream<double?> | audioBitrate | Audio bitrate of the currently playing media source. |
| Stream<AudioDevice> | audioDevice | Currently selected audio device. |
| Stream<List<AudioDevice>> | audioDevices | Currently available audio devices. |
| Stream<Track> | track | Currently selected video, audio and subtitle track. |
| Stream<Tracks> | tracks | Currently available video, audio and subtitle tracks. |
| Stream<int> | width | Currently playing video's width. |
| Stream<int> | height | Currently playing video's height. |
| Stream<List<String>> | subtitle | Currently displayed subtitle. |
| Stream<PlayerLog> | log | Internal logs. |
| Stream<String> | error | Error messages. This may be used to handle & display errors to the user. |
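For example, playback errors can be surfaced to the user by subscribing to the error stream; the handling below is illustrative:

player.stream.error.listen(
  (String message) {
    // Handle or display the error e.g. with a SnackBar or a dialog.
    debugPrint('Playback error: $message');
  },
);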

Shuffle the queue

You may want to shuffle the Playlist currently opened in the Player, like some music players do.

await player.setShuffle(true);

Note: This option is reset upon the next Player.open call.

Use HTTP headers

Declare the httpHeaders argument in the Media constructor. It takes the HTTP headers as Map<String, String>.

final playable = Media(
  'https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4',
  httpHeaders: {
    'Foo': 'Bar',
    'Accept': '*/*',
    'Range': 'bytes=0-',
  },
);

Use extras to store additional data with Media

The extras argument may be utilized to store additional data with a Media in the form of Map<String, dynamic>.

final playable = Media(
  'https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4',
  extras: {
    'track': '9',
    'year': '2012',
    'title': 'Courtesy Call',
    'artist': 'Thousand Foot Krutch',
    'album': 'The End Is Where We Begin',
  },
);

Modify Player's queue

You can add, remove or move a Media within an already playing Playlist:

Add

Add a new Media to the back of the queue:

await player.add(Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'));

Remove

Remove any item from the queue:

await player.remove(0);

Move

Move any item in the queue from one position to another:

await player.move(6, 9);

Go to next, previous or any other position in queue

Skip to the next queue item

await player.next();

Skip to the previous queue item

await player.previous();

Skip to any other queue item

await player.jump(5);

Select video, audio or subtitle track

A media source may contain multiple video, audio or subtitle tracks e.g. for multiple languages. Available video, audio or subtitle tracks are notified through Player's state. See "Handle playback events" section for related information.

By default, the video, audio & subtitle tracks are selected automatically i.e. VideoTrack.auto(), AudioTrack.auto() & SubtitleTrack.auto().

Automatic selection

await player.setVideoTrack(VideoTrack.auto());

await player.setAudioTrack(AudioTrack.auto());

await player.setSubtitleTrack(SubtitleTrack.auto());

Disable track

This may be used to essentially disable video output, disable audio output or stop rendering of subtitles etc.

await player.setVideoTrack(VideoTrack.no());

await player.setAudioTrack(AudioTrack.no());

await player.setSubtitleTrack(SubtitleTrack.no());

Select custom track

// Get the available tracks from [Player]'s current state:
List<VideoTrack> videos = player.state.tracks.video;
List<AudioTrack> audios = player.state.tracks.audio;
List<SubtitleTrack> subtitles = player.state.tracks.subtitle;

// ... or get notified as [Stream]:
player.stream.tracks.listen((event) {
  List<VideoTrack> videos = event.video;
  List<AudioTrack> audios = event.audio;
  List<SubtitleTrack> subtitles = event.subtitle;
});

// Select a track:
await player.setVideoTrack(videos[0]);
await player.setAudioTrack(audios[1]);
await player.setSubtitleTrack(subtitles[2]);

// Get the currently selected tracks from [Player]'s current state:
VideoTrack video = player.state.track.video;
AudioTrack audio = player.state.track.audio;
SubtitleTrack subtitle = player.state.track.subtitle;

// ... or get notified as [Stream]:
player.stream.track.listen((event) {
  VideoTrack video = event.video;
  AudioTrack audio = event.audio;
  SubtitleTrack subtitle = event.subtitle;
});

Select audio device

Available audio devices are notified through Player's state. See "Handle playback events" section for related information.

By default, the audio device is selected automatically i.e. AudioDevice.auto().

Default selection

await player.setAudioDevice(AudioDevice.auto());

Disable audio output

await player.setAudioDevice(AudioDevice.no());

Select custom audio device

// Get the available devices from [Player]'s current state:
List<AudioDevice> devices = player.state.audioDevices;

// ... or get notified as [Stream]:
player.stream.audioDevices.listen((event) {
  List<AudioDevice> devices = event;
});

// Select a device:
await player.setAudioDevice(devices[1]);

// Get the currently selected device from [Player]'s current state:
AudioDevice device = player.state.audioDevice;

// ... or get notified as [Stream]:
player.stream.audioDevice.listen((event) {
  AudioDevice device = event;
});

Display the video

The existing "TL;DR example" should provide you better idea.

For displaying the video inside Flutter UI, you must:

The code is easier to understand:

class _MyScreenState extends State<MyScreen> {
  late final Player player = Player();
  late final VideoController controller = VideoController(player);

  @override
  void dispose() {
    player.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Video(
        controller: controller,
      ),
    );
  }
}

The video playback uses hardware acceleration i.e. GPU by default.

Additional options may be provided using the configuration argument in the constructor. In most situations, you will not need this.

final VideoController controller = VideoController(
  player,
  configuration: const VideoControllerConfiguration(
    // Supply your options:
    enableHardwareAcceleration: true,      // default: true
    width: 640,                            // default: null
    height: 480,                           // default: null
    // The in-code comments are the best place to learn more about these options:
    // https://github.com/media-kit/media-kit/blob/main/media_kit_video/lib/src/video_controller/video_controller.dart
  ),
);

Capture screenshot

The screenshot method takes a snapshot of the current video frame & returns the encoded image bytes as Uint8List.

final Uint8List? screenshot = await player.screenshot();

Additionally, the format argument may be specified to change the encoding format.
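A hedged example, assuming the MIME-type style values accepted by the format argument (e.g. 'image/png'):

// Request a PNG encoded screenshot of the current video frame.
final Uint8List? screenshot = await player.screenshot(format: 'image/png');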

Customize subtitles

SubtitleViewConfiguration can be passed to the Video widget for customizing the subtitles. Notably, TextStyle, TextAlign & EdgeInsetsGeometry can be provided. The code below makes this clear:

Video(
  controller: controller,
  subtitleViewConfiguration: const SubtitleViewConfiguration(
    style: TextStyle(
      height: 1.4,
      fontSize: 24.0,
      letterSpacing: 0.0,
      wordSpacing: 0.0,
      color: Color(0xffffffff),
      fontWeight: FontWeight.normal,
      backgroundColor: Color(0xaa000000),
    ),
    textAlign: TextAlign.center,
    padding: EdgeInsets.all(24.0),
  ),
);

https://user-images.githubusercontent.com/28951144/253067794-73b5ca5d-e90d-4892-bc09-2a80f05c9f0b.mp4

Load external subtitle track

The SubtitleTrack.uri constructor can be used to load an external subtitle track from a URI e.g. SRT, WebVTT etc.:

await player.setSubtitleTrack(
  SubtitleTrack.uri(
    'https://www.iandevlin.com/html5test/webvtt/upc-video-subtitles-en.vtt',
    title: 'English',
    language: 'en',
  ),
);

The SubtitleTrack.data constructor can be used to load an external subtitle track from raw data e.g. SRT, WebVTT etc.:

player.setSubtitleTrack(
  SubtitleTrack.data(
    '''WEBVTT FILE

1
00:00:03.500 --> 00:00:05.000 D:vertical A:start
Everyone wants the most from life

2
00:00:06.000 --> 00:00:09.000 A:start
Like internet experiences that are rich <b>and</b> entertaining

3
00:00:11.000 --> 00:00:14.000 A:end
Phone conversations where people truly <c.highlight>connect</c>

4
00:00:14.500 --> 00:00:18.000
Your favourite TV programmes ready to watch at the touch of a button

5
00:00:19.000 --> 00:00:24.000
Which is why we are bringing TV, internet and phone together in <c.highlight>one</c> super package

6
00:00:24.500 --> 00:00:26.000
<c.highlight>One</c> simple way to get everything

7
00:00:26.500 --> 00:00:27.500 L:12%
UPC

8
00:00:28.000 --> 00:00:30.000 L:75%
Simply for <u>everyone</u>
''',
    title: 'English',
    language: 'en',
  ),
);

Load external audio track

The AudioTrack.uri constructor can be used to load an external audio track from a URI:

await player.setAudioTrack(
  AudioTrack.uri(
    'https://www.iandevlin.com/html5test/webvtt/v/upc-tobymanley.mp4',
    title: 'English',
    language: 'en',
  ),
);

Video controls

package:media_kit provides highly-customizable pre-built video controls.

Apart from theming, the layout can be customized, the position of buttons can be modified, custom buttons can be created etc. Necessary features like fullscreen, keyboard shortcuts & swipe-based controls are also supported by default.

Types

| Type | Description |
|------|-------------|
| AdaptiveVideoControls | Selects MaterialVideoControls, CupertinoVideoControls etc. based on the platform. |
| MaterialVideoControls | Material Design video controls. |
| MaterialDesktopVideoControls | Material Design video controls for desktop. |
| CupertinoVideoControls | iOS-style video controls. |
| NoVideoControls | Disables video controls i.e. only renders video output. |
| Custom | Provide a custom builder for video controls. |

Select existing video controls

Modify the controls argument. For advanced theming of existing video controls, see the "Use & modify video controls" section.

Scaffold(
  body: Video(
    controller: controller,
    // Select [MaterialVideoControls].
    controls: MaterialVideoControls,
  ),
);
Scaffold(
  body: Video(
    controller: controller,
    // Select [CupertinoVideoControls].
    controls: CupertinoVideoControls,
  ),
);

Build custom video controls

Pass a custom builder Widget Function(VideoState state) as the controls argument.

Scaffold(
  body: Video(
    controller: controller,
    // Provide custom builder for controls.
    controls: (state) {
      return Center(
        child: IconButton(
          onPressed: () {
            state.widget.controller.player.playOrPause();
          },
          icon: StreamBuilder(
            stream: state.widget.controller.player.stream.playing,
            builder: (context, playing) => Icon(
              playing.data == true ? Icons.pause : Icons.play_arrow,
            ),
          ),
          // It's not necessary to use [StreamBuilder] or to use [Player] & [VideoController] from [state].
          // [StreamSubscription]s can be made inside [initState] of this widget.
        ),
      );
    },
  ),
);

Use & modify video controls

AdaptiveVideoControls

AdaptiveVideoControls requires no extra configuration; it selects one of the implementations below based on the platform.

MaterialVideoControls

// Wrap [Video] widget with [MaterialVideoControlsTheme].
MaterialVideoControlsTheme(
  normal: MaterialVideoControlsThemeData(
    // Modify theme options:
    buttonBarButtonSize: 24.0,
    buttonBarButtonColor: Colors.white,
    // Modify top button bar:
    topButtonBar: [
      const Spacer(),
      MaterialDesktopCustomButton(
        onPressed: () {
          debugPrint('Custom "Settings" button pressed.');
        },
        icon: const Icon(Icons.settings),
      ),
    ],
  ),
  fullscreen: const MaterialVideoControlsThemeData(
    // Modify theme options:
    displaySeekBar: false,
    automaticallyImplySkipNextButton: false,
    automaticallyImplySkipPreviousButton: false,
  ),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
MaterialDesktopVideoControls

// Wrap [Video] widget with [MaterialDesktopVideoControlsTheme].
MaterialDesktopVideoControlsTheme(
  normal: MaterialDesktopVideoControlsThemeData(
    // Modify theme options:
    seekBarThumbColor: Colors.blue,
    seekBarPositionColor: Colors.blue,
    toggleFullscreenOnDoublePress: false,
    // Modify top button bar:
    topButtonBar: [
      const Spacer(),
      MaterialDesktopCustomButton(
        onPressed: () {
          debugPrint('Custom "Settings" button pressed.');
        },
        icon: const Icon(Icons.settings),
      ),
    ],
    // Modify bottom button bar:
    bottomButtonBar: const [
      Spacer(),
      MaterialDesktopPlayOrPauseButton(),
      Spacer(),
    ],
  ),
  fullscreen: const MaterialDesktopVideoControlsThemeData(),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
Keyboard shortcuts supported by MaterialDesktopVideoControls:

| Shortcut | Action |
|----------|--------|
| Media Play Button | Play |
| Media Pause Button | Pause |
| Media Play/Pause Button | Play/Pause |
| Media Next Track Button | Skip Next |
| Media Previous Track Button | Skip Previous |
| Space | Play/Pause |
| J | Seek 10s Behind |
| I | Seek 10s Ahead |
| Arrow Left | Seek 2s Behind |
| Arrow Right | Seek 2s Ahead |
| Arrow Up | Increase Volume 5% |
| Arrow Down | Decrease Volume 5% |
| F | Enter/Exit Fullscreen |
| Escape | Exit Fullscreen |
CupertinoVideoControls

// Wrap [Video] widget with [CupertinoVideoControlsTheme].
CupertinoVideoControlsTheme(
  normal: const CupertinoVideoControlsThemeData(
    // W.I.P.
  ),
  fullscreen: const CupertinoVideoControlsThemeData(
    // W.I.P.
  ),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
NoVideoControls
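NoVideoControls only renders the video output, without any controls. A minimal usage sketch:

Scaffold(
  body: Video(
    controller: controller,
    // Disable controls i.e. only render video output.
    controls: NoVideoControls,
  ),
);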

Next steps

This guide follows a tutorial-like structure & covers nearly all features that package:media_kit offers. However, it is not complete by any means. You are free to improve this page & add more documentation, which newcomers may find helpful. The following places can help you learn more:

Goals

package:media_kit is a library for Flutter & Dart which provides video & audio playback.

You may see the project's architecture & implementation details for further information.

The project aims to meet the demands of the community; this includes:

  1. Holding accountability.
  2. Ensuring timely maintenance.

Supported Formats

A wide variety of formats & codecs are supported. The complete list may be found below:

``` 3dostr 3DO STR 4xm 4X Technologies aa Audible AA format files aac raw ADTS AAC (Advanced Audio Coding) aax CRI AAX ac3 raw AC-3 ace tri-Ace Audio Container acm Interplay ACM act ACT Voice file format adf Artworx Data Format adp ADP ads Sony PS2 ADS adx CRI ADX aea MD STUDIO audio afc AFC aiff Audio IFF aix CRI AIX alaw PCM A-law alias_pix Alias/Wavefront PIX image alp LEGO Racers ALP amr 3GPP AMR amrnb raw AMR-NB amrwb raw AMR-WB anm Deluxe Paint Animation apac raw APAC apc CRYO APC ape Monkey's Audio apm Ubisoft Rayman 2 APM apng Animated Portable Network Graphics aptx raw aptX aptx_hd raw aptX HD aqtitle AQTitle subtitles argo_asf Argonaut Games ASF argo_brp Argonaut Games BRP argo_cvg Argonaut Games CVG asf ASF (Advanced / Active Streaming Format) asf_o ASF (Advanced / Active Streaming Format) ass SSA (SubStation Alpha) subtitle ast AST (Audio Stream) au Sun AU av1 AV1 Annex B avi AVI (Audio Video Interleaved) avr AVR (Audio Visual Research) avs Argonaut Games Creature Shock avs2 raw AVS2-P2/IEEE1857.4 avs3 raw AVS3-P2/IEEE1857.10 bethsoftvid Bethesda Softworks VID bfi Brute Force & Ignorance bfstm BFSTM (Binary Cafe Stream) bin Binary text bink Bink binka Bink Audio bit G.729 BIT file format bitpacked Bitpacked bmp_pipe piped bmp sequence bmv Discworld II BMV boa Black Ops Audio bonk raw Bonk brender_pix BRender PIX image brstm BRSTM (Binary Revolution Stream) c93 Interplay C93 caf Apple CAF (Core Audio Format) cavsvideo raw Chinese AVS (Audio Video Standard) cdg CD Graphics cdxl Commodore CDXL video cine Phantom Cine codec2 codec2 .c2 demuxer codec2raw raw codec2 demuxer concat Virtual concatenation script cri_pipe piped cri sequence dash Dynamic Adaptive Streaming over HTTP data raw data daud D-Cinema audio dcstr Sega DC STR dds_pipe piped dds sequence derf Xilam DERF dfa Chronomaster DFA dfpwm raw DFPWM1a dhav Video DAV dirac raw Dirac dnxhd raw DNxHD (SMPTE VC-3) dpx_pipe piped dpx sequence dsf DSD Stream File (DSF) dshow DirectShow capture dsicin Delphine Software International CIN dss Digital Speech Standard (DSS) dts raw DTS dtshd raw DTS-HD dv DV (Digital Video) dvbsub raw dvbsub dvbtxt dvbtxt dxa DXA ea Electronic Arts Multimedia ea_cdata Electronic Arts cdata eac3 raw E-AC-3 epaf Ensoniq Paris Audio File exr_pipe piped exr sequence f32be PCM 32-bit floating-point big-endian f32le PCM 32-bit floating-point little-endian f64be PCM 64-bit floating-point big-endian f64le PCM 64-bit floating-point little-endian ffmetadata FFmpeg metadata in text film_cpk Sega FILM / CPK filmstrip Adobe Filmstrip fits Flexible Image Transport System flac raw FLAC flic FLI/FLC/FLX animation flv FLV (Flash Video) frm Megalux Frame fsb FMOD Sample Bank fwse Capcom's MT Framework sound g722 raw G.722 g723_1 G.723.1 g726 raw big-endian G.726 ("left aligned") g726le raw little-endian G.726 ("right aligned") g729 G.729 raw format demuxer gdigrab GDI API Windows frame grabber gdv Gremlin Digital Video gem_pipe piped gem sequence genh GENeric Header gif CompuServe Graphics Interchange Format (GIF) gif_pipe piped gif sequence gsm raw GSM gxf GXF (General eXchange Format) h261 raw H.261 h263 raw H.263 h264 raw H.264 video hca CRI HCA hcom Macintosh HCOM hdr_pipe piped hdr sequence hevc raw HEVC video hls Apple HTTP Live Streaming hnm Cryo HNM v4 ico Microsoft Windows ICO idcin id Cinematic idf iCE Draw File iff IFF (Interchange File Format) ifv IFV CCTV DVR ilbc iLBC storage image2 image2 sequence image2pipe piped image2 sequence imf IMF (Interoperable Master Format) ingenient raw Ingenient MJPEG ipmovie 
Interplay MVE ipu raw IPU Video ircam Berkeley/IRCAM/CARL Sound Format iss Funcom ISS iv8 IndigoVision 8000 video ivf On2 IVF ivr IVR (Internet Video Recording) j2k_pipe piped j2k sequence jacosub JACOsub subtitle format jpeg_pipe piped jpeg sequence jpegls_pipe piped jpegls sequence jpegxl_pipe piped jpegxl sequence jv Bitmap Brothers JV kux KUX (YouKu) kvag Simon & Schuster Interactive VAG laf LAF (Limitless Audio Format) lavfi Libavfilter virtual input device live_flv live RTMP FLV (Flash Video) lmlm4 raw lmlm4 loas LOAS AudioSyncStream lrc LRC lyrics luodat Video CCTV DAT lvf LVF lxf VR native stream (LXF) m4v raw MPEG-4 video matroska,webm Matroska / WebM mca MCA Audio Format mcc MacCaption mgsts Metal Gear Solid: The Twin Snakes microdvd MicroDVD subtitle format mjpeg raw MJPEG video mjpeg_2000 raw MJPEG 2000 video mlp raw MLP mlv Magic Lantern Video (MLV) mm American Laser Games MM mmf Yamaha SMAF mods MobiClip MODS moflex MobiClip MOFLEX mov,mp4,m4a,3gp,3g2,mj2 QuickTime / MOV mp3 MP2/3 (MPEG audio layer 2/3) mpc Musepack mpc8 Musepack SV8 mpeg MPEG-PS (MPEG-2 Program Stream) mpegts MPEG-TS (MPEG-2 Transport Stream) mpegtsraw raw MPEG-TS (MPEG-2 Transport Stream) mpegvideo raw MPEG video mpjpeg MIME multipart JPEG mpl2 MPL2 subtitles mpsub MPlayer subtitles msf Sony PS3 MSF msnwctcp MSN TCP Webcam stream msp Microsoft Paint (MSP)) mtaf Konami PS2 MTAF mtv MTV mulaw PCM mu-law musx Eurocom MUSX mv Silicon Graphics Movie mvi Motion Pixels MVI mxf MXF (Material eXchange Format) mxg MxPEG clip nc NC camera feed nistsphere NIST SPeech HEader REsources nsp Computerized Speech Lab NSP nsv Nullsoft Streaming Video nut NUT nuv NuppelVideo obu AV1 low overhead OBU ogg Ogg oma Sony OpenMG audio paf Amazing Studio Packed Animation File pam_pipe piped pam sequence pbm_pipe piped pbm sequence pcx_pipe piped pcx sequence pfm_pipe piped pfm sequence pgm_pipe piped pgm sequence pgmyuv_pipe piped pgmyuv sequence pgx_pipe piped pgx sequence phm_pipe piped phm sequence photocd_pipe piped photocd sequence pictor_pipe piped pictor sequence pjs PJS (Phoenix Japanimation Society) subtitles pmp Playstation Portable PMP png_pipe piped png sequence pp_bnk Pro Pinball Series Soundbank ppm_pipe piped ppm sequence psd_pipe piped psd sequence psxstr Sony Playstation STR pva TechnoTrend PVA pvf PVF (Portable Voice Format) qcp QCP qdraw_pipe piped qdraw sequence qoi_pipe piped qoi sequence r3d REDCODE R3D rawvideo raw video realtext RealText subtitle format redspark RedSpark rka RKA (RK Audio) rl2 RL2 rm RealMedia roq id RoQ rpl RPL / ARMovie rsd GameCube RSD rso Lego Mindstorms RSO rtp RTP input rtsp RTSP input s16be PCM signed 16-bit big-endian s16le PCM signed 16-bit little-endian s24be PCM signed 24-bit big-endian s24le PCM signed 24-bit little-endian s32be PCM signed 32-bit big-endian s32le PCM signed 32-bit little-endian s337m SMPTE 337M s8 PCM signed 8-bit sami SAMI subtitle format sap SAP input sbc raw SBC (low-complexity subband codec) sbg SBaGen binaural beats script scc Scenarist Closed Captions scd Square Enix SCD sdns Xbox SDNS sdp SDP sdr2 SDR2 sds MIDI Sample Dump Standard sdx Sample Dump eXchange ser SER (Simple uncompressed video format for astronomical capturing) sga Digital Pictures SGA sgi_pipe piped sgi sequence shn raw Shorten siff Beam Software SIFF simbiosis_imx Simbiosis Interactive IMX sln Asterisk raw pcm smjpeg Loki SDL MJPEG smk Smacker smush LucasArts Smush sol Sierra SOL sox SoX native spdif IEC 61937 (compressed data in S/PDIF) srt SubRip subtitle stl Spruce subtitle format 
subviewer SubViewer subtitle format subviewer1 SubViewer v1 subtitle format sunrast_pipe piped sunrast sequence sup raw HDMV Presentation Graphic Stream subtitles svag Konami PS2 SVAG svg_pipe piped svg sequence svs Square SVS swf SWF (ShockWave Flash) tak raw TAK tedcaptions TED Talks captions thp THP tiertexseq Tiertex Limited SEQ tiff_pipe piped tiff sequence tmv 8088flex TMV truehd raw TrueHD tta TTA (True Audio) tty Tele-typewriter txd Renderware TeXture Dictionary ty TiVo TY Stream u16be PCM unsigned 16-bit big-endian u16le PCM unsigned 16-bit little-endian u24be PCM unsigned 24-bit big-endian u24le PCM unsigned 24-bit little-endian u32be PCM unsigned 32-bit big-endian u32le PCM unsigned 32-bit little-endian u8 PCM unsigned 8-bit v210 Uncompressed 4:2:2 10-bit v210x Uncompressed 4:2:2 10-bit vag Sony PS2 VAG vbn_pipe piped vbn sequence vc1 raw VC-1 vc1test VC-1 test bitstream vfwcap VfW video capture vidc PCM Archimedes VIDC vividas Vividas VIV vivo Vivo vmd Sierra VMD vobsub VobSub subtitle format voc Creative Voice vpk Sony PS2 VPK vplayer VPlayer subtitles vqf Nippon Telegraph and Telephone Corporation (NTT) TwinVQ w64 Sony Wave64 wady Marble WADY wav WAV / WAVE (Waveform Audio) wavarc Waveform Archiver wc3movie Wing Commander III movie webm_dash_manifest WebM DASH Manifest webp_pipe piped webp sequence webvtt WebVTT subtitle wsaud Westwood Studios audio wsd Wideband Single-bit Data (WSD) wsvqa Westwood Studios VQA wtv Windows Television (WTV) wv WavPack wve Psion 3 audio xa Maxis XA xbin eXtended BINary text (XBIN) xbm_pipe piped xbm sequence xmd Konami XMD xmv Microsoft XMV xpm_pipe piped xpm sequence xvag Sony PS3 XVAG xwd_pipe piped xwd sequence xwma Microsoft xWMA yop Psygnosis YOP yuv4mpegpipe YUV4MPEG pipe ```


Permissions

You may need to declare & request internet access or file-system permissions depending upon the platform.

Android

Edit android/app/src/main/AndroidManifest.xml to add the following permissions inside <manifest> tag:

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.example.app">
    <application
        ...>
    </application>
    <!--
      Internet access permissions.
      -->
    <uses-permission android:name="android.permission.INTERNET" />
    <!--
      Media access permissions.
      Android 13 or higher.
      https://developer.android.com/about/versions/13/behavior-changes-13#granular-media-permissions
      -->
    <uses-permission android:name="android.permission.READ_MEDIA_AUDIO" />
    <uses-permission android:name="android.permission.READ_MEDIA_VIDEO" />
    <!--
      Storage access permissions.
      Android 12 or lower.
      -->
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
</manifest>

Use package:permission_handler to request access at runtime:

if (/* Android 13 or higher. */) {
  // Video permissions.
  if (await Permission.videos.isDenied || await Permission.videos.isPermanentlyDenied) {
    final state = await Permission.videos.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
  // Audio permissions.
  if (await Permission.audio.isDenied || await Permission.audio.isPermanentlyDenied) {
    final state = await Permission.audio.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
} else {
  if (await Permission.storage.isDenied || await Permission.storage.isPermanentlyDenied) {
    final state = await Permission.storage.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
}
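The Android 13 check itself is elided above; one possible sketch uses package:device_info_plus (an assumption; it is not a dependency of package:media_kit):

import 'package:device_info_plus/device_info_plus.dart';

Future<bool> isAndroid13OrHigher() async {
  // Android 13 corresponds to SDK version 33.
  final info = await DeviceInfoPlugin().androidInfo;
  return info.version.sdkInt >= 33;
}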

iOS

Edit ios/Runner/Info-Release.plist, ios/Runner/Info-Profile.plist, ios/Runner/Info-Debug.plist:

Enable internet access

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>

Windows

N/A

macOS

Edit macos/Runner/Release.entitlements & macos/Runner/DebugProfile.entitlements:

Enable internet access

<key>com.apple.security.network.client</key>
<true/>

Disable sandbox to access files

<key>com.apple.security.app-sandbox</key>
<false/>

GNU/Linux

N/A

Web

N/A

Notes

Android

N/A

iOS

N/A

Windows

N/A

macOS

During the build phase, the following warnings are not critical and cannot be silenced:

#import "Headers/media_kit_video-Swift.h"
        ^
/path/to/media_kit/media_kit_test/build/macos/Build/Products/Debug/media_kit_video/media_kit_video.framework/Headers/media_kit_video-Swift.h:270:31: warning: 'objc_ownership' only applies to Objective-C object or block pointer types; type here is 'CVPixelBufferRef' (aka 'struct __CVBuffer *')
- (CVPixelBufferRef _Nullable __unsafe_unretained)copyPixelBuffer SWIFT_WARN_UNUSED_RESULT;
# 1 "<command line>" 1
 ^
<command line>:20:9: warning: 'POD_CONFIGURATION_DEBUG' macro redefined
#define POD_CONFIGURATION_DEBUG 1 DEBUG=1 
        ^
#define POD_CONFIGURATION_DEBUG 1
        ^

GNU/Linux

Install libmpv

System shared libraries from distribution-specific user-installed packages are used by default. This is how GNU/Linux works. You can install these as follows:

Ubuntu/Debian
sudo apt install libmpv-dev mpv
Packaging

There are other ways to bundle these within your app package e.g. within Snap or Flatpak. A few examples:

Utilize mimalloc

You should consider replacing the default memory allocator with mimalloc to avoid memory leaks.

This is as simple as adding one line to linux/CMakeLists.txt:

target_link_libraries(${BINARY_NAME} PRIVATE ${MIMALLOC_LIB})

Web

On the web, libmpv is not used. Video & audio playback is handled by embedding an HTML <video> element. Format support depends upon the web browser & is extremely limited compared to native platforms.

Architecture

package:media_kit


%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  Player *-- PlatformPlayer
  PlatformPlayer <|-- NativePlayer
  PlatformPlayer <|-- WebPlayer
  PlatformPlayer *-- PlayerState
  PlatformPlayer *-- PlayerStream
  PlatformPlayer o-- PlayerConfiguration

  NativePlayer <.. NativeLibrary
  NativePlayer <.. Initializer
  Initializer o-- InitializerIsolate
  Initializer o-- InitializerNativeEventLoop

  Playable <.. Media
  Playable <.. Playlist

  class Initializer {
    +create(path: String, callback: Function, options: Map<String, String>): Future<Pointer<mpv_handle>>
    +dispose(handle: Pointer<mpv_handle>)
  }

  class InitializerIsolate {
    +create(path: String, callback: Function, options: Map<String, String>): Future<Pointer<mpv_handle>>
    +dispose(handle: Pointer<mpv_handle>)
  }

  class InitializerNativeEventLoop {
    +ensureInitialized()
    +create(path: String, callback: Future<void> Function(Pointer<mpv_event> event), options: Map<String, String>): Future<Pointer<mpv_handle>>
    +dispose(handle: Pointer<mpv_handle>)
  }

  class Playable {
  }

  class AudioDevice {
  }

  class Media {
    +String uri
    +dynamic extras
  }

  class Playlist {
    +List<Media> medias
    +int index
  }

  class PlayerStream {
    +Stream<Playlist> playlist
    +Stream<bool> playing
    +Stream<bool> completed
    +Stream<Duration> position
    +Stream<Duration> duration
    +Stream<Duration> buffer
    +Stream<double> volume
    +Stream<double> rate
    +Stream<double> pitch
    +Stream<bool> buffering
    +Stream<AudioParams> audioParams
    +Stream<VideoParams> videoParams
    +Stream<double?> audioBitrate
    +Stream<AudioDevice> audioDevice
    +Stream<List<AudioDevice>> audioDevices
    +Stream<Track> track
    +Stream<Tracks> tracks
    +Stream<int> width
    +Stream<int> height
    +Stream<List<String>> subtitle
    +Stream<PlayerLog> log
    +Stream<String> error
  }

  class PlayerState {
    +Playlist playlist
    +bool playing
    +bool completed
    +Duration position
    +Duration duration
    +Duration buffer
    +double volume
    +double rate
    +double pitch
    +bool buffering
    +AudioParams audioParams
    +VideoParams videoParams
    +double? audioBitrate
    +AudioDevice audioDevice
    +List<AudioDevice> audioDevices
    +Track track
    +Tracks tracks
    +int width
    +int height
    +List<String> subtitle
  }

  class Player {
    +PlatformPlayer? platform

    +«get» PlayerState state
    +«get» PlayerStream stream

    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List
  }

  class PlatformPlayer {
    +PlayerState state
    +PlayerStream stream
    +PlayerConfiguration configuration

    +dispose()*
    +open(playable: Playable)*
    +play()*
    +stop()*
    +pause()*
    +playOrPause()*
    +add(media: Media)*
    +remove(index: int)*
    +next()*
    +previous()*
    +jump(index: int)*
    +move(from: int, to: int)*
    +seek(duration: Duration)*
    +setPlaylistMode(playlistMode: PlaylistMode)*
    +setVolume(volume: double)*
    +setRate(rate: double)*
    +setPitch(pitch: double)*
    +setShuffle(shuffle: bool)*
    +setAudioDevice(device: AudioDevice)*
    +setVideoTrack(track: VideoTrack)*
    +setAudioTrack(track: AudioTrack)*
    +setSubtitleTrack(track: SubtitleTrack)*
    +screenshot(): Uint8List*

    +«get» handle: Future<int>*

    #StreamController<Playlist> playlistController
    #StreamController<bool> playingController
    #StreamController<bool> completedController
    #StreamController<Duration> positionController
    #StreamController<Duration> durationController
    #StreamController<Duration> bufferController
    #StreamController<double> volumeController
    #StreamController<double> rateController
    #StreamController<double> pitchController
    #StreamController<bool> bufferingController
    #StreamController<PlayerLog> logController
    #StreamController<PlayerError> errorController
    #StreamController<AudioParams> audioParamsController
    #StreamController<double?> audioBitrateController
    #StreamController<AudioDevice> audioDeviceController
    #StreamController<List<AudioDevice>> audioDevicesController
    #StreamController<Track> trackController
    #StreamController<Tracks> tracksController
    #StreamController<int> widthController
    #StreamController<int> heightController
  }

  class NativePlayer {
    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List

    +«get» handle: Future<int>
  }

  class WebPlayer {
    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List

    +«get» handle: Future<int>
  }

  class NativeLibrary {
    +find()$ String?
  }

package:media_kit_video


Android

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Create VideoOutput(s) to send back id & wid for render. Dispose to release.
  VideoOutput <.. MediaKitAndroidHelper: Create & dispose JNI global object reference to android.view.Surface (for --wid)

  class MediaKitVideoPlugin {
    +Activity activity$
    -MethodChannel channel
    -TextureRegistry textureRegistry
    -VideoOutputManager videoOutputManager
  }

  class VideoOutputManager {
    -HashMap<Long, VideoOutput> videoOutputs
    -MethodChannel channelReference
    -TextureRegistry textureRegistryReference
    -Object lock

    +create(handle: long): VideoOutput
    +dispose(handle: long): void
    +createSurface(handle: long): long
    +setSurfaceTextureSize(handle: long, width: int, height: int): void
  }

  class VideoOutput {
    +long id
    +long wid

    -Surface surface
    -TextureRegistry.SurfaceTextureEntry surfaceTextureEntry
    -Method newGlobalObjectRef
    -Method deleteGlobalObjectRef

    -long handle
    -MethodChannel channelReference
    -TextureRegistry textureRegistryReference

    +dispose()
    +createSurface(): long
    +setSurfaceTextureSize(width: int, height: int)
  }

  class MediaKitAndroidHelper {
    +newGlobalObjectRef(obj: Object): long
    +deleteGlobalObjectRef(ref: long): void
    +setApplicationContext(context: Context): void
    +copyAssetToExternalFilesDir(assetName: String): String
  }

iOS

TODO: documentation.

macOS

TODO: documentation.

Windows

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes PluginRegistrarWindows as reference
  VideoOutputManager "1" *-- "1" ThreadPool
  VideoOutput "*" o-- "1" ThreadPool: Post creation, resize & render etc. tasks involving EGL to ensure synchronous EGL/ANGLE usage across multiple VideoOutput(s)
  VideoOutput "1" *-- "1" ANGLESurfaceManager: Only for H/W accelerated rendering

  class MediaKitVideoPlugin {
    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<MethodChannel> channel_
    -std::unique_ptr<VideoOutputManager> video_output_manager_
    -HandleMethodCall(method_call, result);
  }

  class ThreadPool {
    +Post(function: std::function)
  }

  class VideoOutputManager {
    +Create(handle: int, width: optional<int>, height: optional<int>, texture_update_callback: std::function)
    +Dispose(handle: int)

    -std::mutex mutex_
    -std::unique_ptr<ThreadPool> thread_pool_
    -flutter::PluginRegistrarWindows registrar_
    -std::unordered_map<int64_t, std::unique_ptr<VideoOutput>> video_outputs_
  }

  class VideoOutput {
    +«get» texture_id: int64_t
    +«get» width: int64_t
    +«get» height: int64_t
    -mpv_handle* handle_
    -mpv_render_context* render_context_
    -std::optional<int64_t> width_
    -std::optional<int64_t> height_
    -bool enable_hardware_acceleration_
    -int64_t texture_id_
    -flutter::PluginRegistrarWindows registrar_
    -ThreadPool* thread_pool_ref_
    -bool destroyed_
    -std::mutex textures_mutex_
    -std::unordered_map<int64_t, std::unique_ptr<flutter::TextureVariant>> texture_variants_
    -std::unique_ptr<ANGLESurfaceManager> surface_manager_ HW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopGpuSurfaceDescriptor>> textures_ HW
    -std::unique_ptr<uint8_t[]> pixel_buffer_ SW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopPixelBuffer>> pixel_buffer_textures_ SW
    -std::function texture_update_callback_

    +SetTextureUpdateCallback(callback: std::function<void(int64_t, int64_t, int64_t)>)
    +SetSize(width: std::optional<int64_t>, height: std::optional<int64_t>)
    -NotifyRender()
    -Render()
    -CheckAndResize()
    -Resize(required_width: int64_t, required_height: int64_t)
    -GetVideoWidth(): int64_t
    -GetVideoHeight(): int64_t
  }

  class ANGLESurfaceManager {
    +«get» width: int32_t
    +«get» height: int32_t
    +«get» handle: HANDLE

    +HandleResize(width: int32_t, height: int32_t)
    +Draw(draw_callback: std::function<void()>)
    +Read()
    +MakeCurrent(value: bool)
    -CreateEGLDisplay()
    -SwapBuffers()
    -Create()
    -CleanUp(release_context: bool)
    -CreateD3DTexture()
    -CreateEGLDisplay()
    -CreateAndBindEGLSurface()

    -IDXGIAdapter* adapter_
    -int32_t width_
    -int32_t height_
    -HANDLE internal_handle_
    -HANDLE handle_
    -HANDLE mutex_
    -ID3D11Device* d3d_11_device_
    -ID3D11DeviceContext* d3d_11_device_context_
    -Microsoft::WRL::ComPtr<ID3D11Texture2D> internal_d3d_11_texture_2D_
    -Microsoft::WRL::ComPtr<IDXGISwapChain> d3d_11_texture_2D_
    -EGLSurface surface_
    -EGLDisplay display_
    -EGLContext context_
    -EGLConfig config_
  }

GNU/Linux

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes FlTextureRegistrar as reference
  VideoOutput "1" *-- "1" TextureGL: For H/W rendering.
  TextureGL "1" o-- "1" VideoOutput: Take VideoOutput as reference
  VideoOutput "1" *-- "1" TextureSW: For S/W rendering.
  TextureSW "1" o-- "1" VideoOutput: Take VideoOutput as reference
  TextureGL "1" <-- "1" FlTextureGL
  TextureSW "1" <-- "1" FlTexture

  class MediaKitVideoPlugin {
    -FlMethodChannel* channel
    -VideoOutputManager* video_output_manager
  }

  class VideoOutputManager {
    -GHashTable* video_outputs
    -FlTextureRegistrar* texture_registrar
    +video_output_manager_create(self: VideoOutputManager*, handle: gint64, width: gint64, height: gint64, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_manager_dispose(self: VideoOutputManager*, handle: gint64)
  }

  class VideoOutput {
    -TextureGL* texture_gl
    -GdkGLContext* context_gl
    -mpv_handle* handle
    -mpv_render_context* render_context
    -gint64 width
    -gint64 height
    -TextureUpdateCallback texture_update_callback
    -gpointer texture_update_callback_context
    -FlTextureRegistrar* texture_registrar
    +video_output_set_texture_update_callback(self: VideoOutput*, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_get_render_context(self: VideoOutput*): mpv_render_context*
    +video_output_get_width(self: VideoOutput*): gint64
    +video_output_get_height(self: VideoOutput*): gint64
    +video_output_get_texture_id(self: VideoOutput*): gint64
    +video_output_notify_texture_update(self: VideoOutput*);
  }

  class TextureGL {
    -guint32 name
    -guint32 fbo
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_gl_populate_texture(texture: FlTextureGL*, target: guint32*, name: guint32*, width: guint32*, height: guint32*, error: GError**): gboolean
  }

  class TextureSW {
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_sw_copy_pixels(texture: FlPixelBufferTexture*, buffer: const uint8_t**, width: uint32_t*, height: uint32_t*, error: GError**): gboolean
  }

Web

TODO: documentation.

Implementation

libmpv is used for audio & video playback. It seems the best possible option since it supports a wide variety of audio & video formats, provides hardware acceleration & keeps the bundle size minimal (only the required decoders etc. are selected in FFmpeg/mpv).

Another major advantage is that a large part of the implementation (80%+) is shared across platforms using FFI. This makes the behavior of the package very similar on all supported platforms & makes maintenance easier (since there is less code & most of it is in Dart).

Alternative backends may be implemented in future to meet certain demands (& project architecture makes it possible).

package:media_kit

package:media_kit is entirely written in Dart. It uses dart:ffi to invoke the native C API of libmpv through its shared libraries. All the callback management, event Streams & other methods to control playback of audio/video are implemented in Dart with the help of FFI. Event management i.e. the position, duration, bitrate & audioParams Streams is important to render changes in the UI.

A big limitation with FFI in the Dart SDK has been that it does not support async callbacks from another thread. Learn more about this at: dart/sdk#37022. The following situation explains it better:

If you pass a function pointer from Dart to C code, you can invoke it fine. But as soon as you invoke it from some other thread on the native side, the Dart VM will instantly crash. This matters because most events take place on a background thread.

However, I could do this within Dart because libmpv offers an "event polling"-like way to listen to events. The idea was to spawn a background Isolate & run the event loop there: I get the memory address of each event and forward it outside the Isolate with the help of a ReceivePort, where I finally interpret it using more FFI code. I have explained this in detail within the in-code comments of initializer.dart, where I had to perform a lot more trickery to get this to work.
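A minimal sketch of this pattern (not the actual initializer.dart code), with the blocking native call stubbed out:

import 'dart:isolate';

// Runs inside a background [Isolate]: polls for events & forwards their
// memory addresses to the main [Isolate]. The real implementation blocks
// on mpv_wait_event via FFI; here plain ints stand in for event pointers.
void eventLoop(SendPort port) {
  for (int address = 0; address < 3; address++) {
    port.send(address);
  }
}

void main() async {
  final receivePort = ReceivePort();
  await Isolate.spawn(eventLoop, receivePort.sendPort);
  await for (final address in receivePort.take(3)) {
    // Back on the main [Isolate]: interpret the address using FFI,
    // e.g. Pointer<mpv_event>.fromAddress(address).
    print('Received event at address: $address');
  }
}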

Thus, invoking native methods & handling events could be done within 100% Dart using FFI. This is enough for audio playback & supports both the Flutter SDK & the Dart VM. Later, it was discovered that going beyond a certain number of simultaneous instances caused a deadlock (dart-lang/sdk#51254 & dart-lang/sdk#51261), freezing the UI entirely alongside any other Dart code in execution. To deal with this, a new package, package:media_kit_native_event_loop, was created. Adding package:media_kit_native_event_loop to pubspec.yaml automatically resolves this issue without any changes to code!

However, no such "event polling"-like API is possible for video rendering. So, the best idea seemed to be a new package, package:media_kit_video, specifically offering platform-specific video embedding implementations which internally handle Flutter's Texture Registry API & libmpv's OpenGL rendering API. This package only consumes the mpv_handle* (which can easily be shared as a primitive int value) of the instance (created with package:media_kit through FFI) to set up a new viewport. The detailed implementation is discussed below.

package:media_kit_native_event_loop

Platform-specific threaded event handling for media_kit. Enables support for a higher number of concurrent instances.

The package contains a minimal C++ implementation which spawns a detached std::thread. This runs the mpv_wait_event loop & forwards the events using postCObject, SendPort & ReceivePort to the Dart VM. Necessary mutex synchronization also takes place.

The Isolate-based event loop is avoided once this package is added to the project.

package:media_kit_video

Android

On Android, the texture registry API is based on android.graphics.SurfaceTexture.

libmpv can render directly onto an android.view.Surface after setting --wid. Creation of a new android.view.Surface requires reference to an existing android.graphics.SurfaceTexture, [which can be consumed from the texture entry created by Flutter itself](https://api.flutter.dev/javadoc/io/flutter/view/TextureRegistry.SurfaceTextureEntry.html#surfaceTexture()).

This requires --hwdec=mediacodec for hardware decoding, along with --vo=mediacodec_embed and --wid=(intptr_t)(*android.view.Surface).

More details may be found at: https://mpv.io/manual/stable/#video-output-drivers-mediacodec-embed

Obtaining a global reference pointer to a Java object (android.view.Surface in our case) requires JNI. For this, a custom shared library is used; you can find its implementation at media-kit/media-kit-android-helper. Since compiling it would require the NDK (& make the process tedious), pre-built shared libraries are bundled for each architecture at development/build time.

Since package:media_kit is a Dart package (which works independently of Flutter), accessing assets was a challenging part. The mentioned shared libraries generated by media-kit/media-kit-android-helper help to access assets bundled inside the Android APK from Dart (using FFI, without depending on Flutter).

iOS

iOS shares much of its implementation with macOS. The only difference is that OpenGL ES is used instead of OpenGL.

macOS

On macOS the current implementation is based on libmpv and can be summarized as follows:

  1. H/W video decoding: the mpv option hwdec is set to auto; it does not depend on a pixel buffer.
  2. OpenGL rendering is performed onto an OpenGL texture backed by a pixel buffer (CVPixelBuffer), which makes it interoperable with Metal.

Windows

The rendering is hardware accelerated i.e. GPU-backed buffers are used. This is a performant approach, easily capable of rendering 4K 60 FPS videos; the rest depends on the hardware. Since the libmpv API is OpenGL based & the Texture API in Flutter is Direct3D based, ANGLE (Almost Native Graphics Layer Engine) is used for interop; it translates the OpenGL ES 2.0 calls into Direct3D.

This hardware-accelerated video output requires DirectX 11 or higher. Most Windows systems with either integrated or discrete GPUs should already support this. On systems where Direct3D fails to load (due to missing graphics drivers, an unsupported feature level or DirectX version etc.), a fallback pixel-buffer based software renderer is used. This means that the video is rendered by the CPU & every frame is copied back to RAM. This causes some redundant load on the CPU, decreases battery life & may not play higher resolution videos properly. However, it works well.

Windows 7 & 8.x also work correctly. ![0](https://user-images.githubusercontent.com/28951144/212947036-4a2430d6-729e-47d7-a356-c8cc8534a1aa.jpg) ![1](https://user-images.githubusercontent.com/28951144/212947046-cc8d441c-96f8-4437-9f59-b4613ca73f2a.jpg)

You may visit the experimentation repository to see a minimal example showing OpenGL ES usage in Flutter Windows.

GNU/Linux

On Flutter Linux, both the OpenGL (H/W) & the pixel buffer (S/W) APIs are available for rendering on the Texture widget.

Web

Video & audio playback is handled by embedding an HTML <video> element.

License

Copyright © 2021 & onwards, Hitesh Kumar Saini <saini123hitesh@gmail.com>

This project & the work under this repository are governed by the MIT license that can be found in the LICENSE file.