shaka-project / shaka-streamer

A simple config-file based approach to preparing streaming media, based on FFmpeg and Shaka Packager.
https://shaka-project.github.io/shaka-streamer/
Apache License 2.0

How to configure utc_timings for LL-DASH? #108

Closed je3f0o closed 2 years ago

je3f0o commented 2 years ago

Hello, I cannot figure out how to configure utc_timings in my pipeline and I keep getting an error:

Fatal error:
  In PipelineConfig, utc_timings field requires a list of UtcTimingPair

Based on the error message it should be a list, so I tried this in my pipeline.yaml:

utc_timings:
  - urn:mpeg:dash:utc:http-head:2014=https://<mysite>/api/v2/time

I even tried splitting on : and putting the parts in a list like:

utc_timings:
  - urn
  - mpeg
  - dash
  - ...

I cannot find any example of how to use utc_timings in a shaka-streamer config file anywhere on the internet...

Also, another side question: I set segment_size: 6 in my pipeline.yaml, but I see 12-second segments in my stream.m3u8 files. I wrote a simple bash script using ffmpeg named pipes and Shaka Packager, and the segment size was exactly what I intended. But with shaka-streamer the segment size doesn't match my config file... why?

By the way, I use the Linux 64-bit binaries: shaka-streamer 0.5.1, packager v2.6.1-634af65-release, ffmpeg n4.4.

joeyparrish commented 2 years ago

To see how utc_timings should be formatted, go to the docs and click on the config reference, then search for utc_timings. There you'll see that it's a list of UtcTimingPair objects, and you can click on UtcTimingPair to see the definition. You should do it like this:

utc_timings:
  - scheme_id_uri: "urn:mpeg:dash:utc:http-head:2014"
    value: "https://<mysite>/api/v2/time"

As for your segment size, please share a pipeline config with us so we can check the format. segment_size should be at the root of your pipeline config.
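For example, a minimal sketch of the expected placement (other fields omitted):

streaming_mode: vod

# Top-level field, not nested under inputs or resolutions.
segment_size: 6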

Do you get the same doubled segment size in audio, or just in video? Is your video content interlaced, by chance?
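If you're not sure whether it's interlaced, ffprobe can report the field order (a sketch; input.mp4 is a placeholder for your source):

ffprobe -v error -select_streams v:0 -show_entries stream=field_order -of default=noprint_wrappers=1 input.mp4

It prints field_order=progressive for progressive content; values like tt or bb indicate interlaced fields.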

github-actions[bot] commented 2 years ago

@je3f0o Does this answer all your questions? If so, would you please close the issue?

je3f0o commented 2 years ago

Hello, sorry for the late response.

This is my pipeline.yaml file. It's pretty much copy-pasted from the examples.

# Streaming mode.  Can be live or vod.
streaming_mode: vod

debug_logs: True

# A list of resolutions to encode.
# For VOD, you can specify many more resolutions than you would with live,
# since the encoding does not need to be done in real time.
resolutions:
  - 1080p
  - 720p
  - 480p
  - 240p

# A list of channel layouts to encode.
channel_layouts:
  - stereo

# The codecs to encode with.
audio_codecs:
  - aac
  - opus
video_codecs:
  - h264
  - vp9

# Manifest format (dash, hls or both)
manifest_format:
  - dash
  - hls

hls_output: hls.m3u8
dash_output: dash.mpd

segment_folder: segments

# Length of each segment in seconds.
segment_size: 6

# Forces the use of SegmentTemplate in DASH.
segment_per_file: True

Maybe the machine didn't have enough power to encode both HLS and DASH together with 6-second segments, I don't know...

Also, I found another problem while using RTMP input:

inputs:
  - name: "rtmp://localhost/live/6df2dc85-78f6-11ec-acbe-b07b2509af55"
    media_type: video

  # A second track (audio) from the same RTMP input.
  - name: "rtmp://localhost/live/6df2dc85-78f6-11ec-acbe-b07b2509af55"
    media_type: audio

In this case, audio and video weren't syncing. I saw from the logs that ffmpeg was opening 2 separate inputs instead of mapping both streams from one input. That's why audio and video weren't syncing, I guess...

ffmpeg -i input1 -i input2 ...
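What I expected was a single input with both streams mapped from it, roughly like this (stream key and remaining arguments are placeholders):

ffmpeg -i "rtmp://localhost/live/<stream-key>" -map 0:v -map 0:a ...

I guess two separate -i connections to the same live stream can join at slightly different points, which would explain the drift.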

So I ended up using my own scripts.