unosquare / ffmediaelement

FFME: The Advanced WPF MediaElement (based on FFmpeg)
https://unosquare.github.io/ffmediaelement/

Feature/data stream #442

Closed sylvaneau closed 4 years ago

sylvaneau commented 5 years ago

FFmpeg is able to decode Data streams as well as Video, Audio or Subtitle streams.

One common example of a data stream appears in MPEG2-TS container videos produced by drones. These videos include a data stream encoded in the KLV (Key-Length-Value) format, carrying the drone's position and attitude.
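To make the KLV format concrete, here is a minimal parser sketch. It assumes 16-byte keys and BER-encoded lengths (as used by SMPTE 336M / MISB UAS metadata); the function name and defaults are illustrative and not part of FFME or the library discussed in this thread.

```python
def parse_klv(data: bytes, key_size: int = 16):
    """Yield (key, value) pairs from a raw KLV byte stream.

    Assumes fixed-size keys and BER-encoded lengths:
    - short form: a single byte < 0x80 is the length itself
    - long form: high bit set, low 7 bits give the number of
      following big-endian length bytes
    """
    i = 0
    # Need at least a key plus one length byte to continue
    while i + key_size < len(data):
        key = data[i:i + key_size]
        i += key_size

        first = data[i]
        i += 1
        if first < 0x80:
            length = first           # short-form BER length
        else:
            n = first & 0x7F         # long form: n length bytes follow
            length = int.from_bytes(data[i:i + n], "big")
            i += n

        value = data[i:i + length]
        i += length
        yield key, value
```

A caller would feed this the raw packet bytes and match each 16-byte key against the metadata dictionary it cares about (e.g. position and attitude tags).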

We use your player to display drone videos, and we have implemented the data stream decoding feature.

As there are very few codecs available for data stream decoding/encoding, we decided to bypass the "codec" part and to materialize Frames/Blocks with the byte array containing the actual data. The actual decoding is done externally.

There are few KLV decoding libraries available; we have developed our own and are currently discussing making it open source.

Here is an example of a video with a data stream: http://samples.ffmpeg.org/MPEG2/mpegts-klv/Day%20Flight.mpg

Just a note about how the PTS (presentation timestamp) is computed for data packets. There are two ways to encode the PTS for packets in data streams:

This way, the events linked to the data stream are raised as close as possible to their actual position in the source stream.

I hope everything is clear, and I would be glad to see our contribution merged into this great project ;-)

Regards

phamvv commented 5 years ago

Thanks for your contributions!

mariodivece commented 4 years ago

Codacy: Here is an overview of what got changed by this pull request:


Complexity increasing per file
==============================
- Unosquare.FFME/Constants.cs  2
- Unosquare.FFME/Container/DataComponent.cs  5
- Unosquare.FFME.Windows/Common/RenderingDataEventArgs.cs  2
- Unosquare.FFME/Container/MediaContainer.cs  1
- Unosquare.FFME/Container/MediaComponent.cs  2
- Unosquare.FFME.Windows/Rendering/DataRenderer.cs  3
- Unosquare.FFME/Engine/MediaEngineState.cs  5
- Unosquare.FFME.Windows/Platform/MediaConnector.cs  1
- Unosquare.FFME/Container/DataFrame.cs  3
- Unosquare.FFME/Container/DataBlock.cs  1
- Unosquare.FFME/Engine/TimingController.cs  1

See the complete overview on Codacy

mariodivece commented 4 years ago

After careful consideration and testing, I came to the conclusion that data packets have to be processed differently from audio, video, or subtitle (multimedia) packets, for the following reasons:

  1. Data packets (data, attachment, or unknown streams) are meant to be handled directly by the user. The information encoded in them almost always has nothing to do with a codec.
  2. Data packets must not interfere with the timing of multimedia playback. Adding data types and queues that might break the handling of multimedia types would have increased code complexity significantly.
  3. There may be more than one data stream the user needs access to. Multimedia playback processes only one stream of each type at a time, so I needed a way to provide the user with packets from multiple simultaneous data streams.

While I have kept the main ideas presented in this PR, the approach has changed significantly. You can handle packets manually by subscribing to the DataFrameReceived event, which carries the DataFrame. The start time is estimated when no packet timing information is available, and the raw data is exposed via the PacketData property.

Please check it out and open a new issue if you believe I have missed something. Thanks so much @sylvaneau !