Closed je3f0o closed 2 years ago
To see how utc_timings should be formatted, go to the docs and click on the config reference, then search for utc_timings. There you'll see that it's a list of UtcTimingPair objects, and you can click on UtcTimingPair to see the definition. You should do it like this:
utc_timings:
  - scheme_id_uri: "urn:mpeg:dash:utc:http-head:2014"
    value: "https://<mysite>/api/v2/time"
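If you need more than one timing source, additional pairs can be appended to the same list. A sketch, assuming a second scheme: urn:mpeg:dash:utc:http-xsdate:2014 comes from the MPEG-DASH spec, not from this issue, and <mysite> is a placeholder as above.

```yaml
# Illustrative only: two UtcTimingPair entries in one list.
utc_timings:
  - scheme_id_uri: "urn:mpeg:dash:utc:http-head:2014"
    value: "https://<mysite>/api/v2/time"
  - scheme_id_uri: "urn:mpeg:dash:utc:http-xsdate:2014"
    value: "https://<mysite>/api/v2/time"
```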
As for your segment size, please share a pipeline config with us so we can check the format. segment_size
should be at the root of your pipeline config.
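For reference, a minimal sketch of where segment_size belongs, with all values illustrative rather than taken from your setup:

```yaml
# Illustrative minimal pipeline config; values are examples only.
streaming_mode: vod
resolutions:
  - 720p
audio_codecs:
  - aac
video_codecs:
  - h264
manifest_format:
  - hls
# segment_size goes at the top level, in seconds:
segment_size: 6
```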
Do you get the same doubled segment size in audio, or just in video? Is your video content interlaced, by chance?
@je3f0o Does this answer all your questions? If so, would you please close the issue?
Hello, sorry for the late response.
This is my pipeline.yaml file. It's pretty much copy-pasted from the examples.
# Streaming mode. Can be live or vod.
streaming_mode: vod
debug_logs: True
# A list of resolutions to encode.
# For VOD, you can specify many more resolutions than you would with live,
# since the encoding does not need to be done in real time.
resolutions:
- 1080p
- 720p
- 480p
- 240p
# A list of channel layouts to encode.
channel_layouts:
- stereo
# The codecs to encode with.
audio_codecs:
- aac
- opus
video_codecs:
- h264
- vp9
# Manifest format (dash, hls or both)
manifest_format:
- dash
- hls
hls_output: hls.m3u8
dash_output: dash.mpd
segment_folder: segments
# Length of each segment in seconds.
segment_size: 6
# Forces the use of SegmentTemplate in DASH.
segment_per_file: True
Maybe the machine didn't have enough power to encode both HLS and DASH together in 6 seconds, I don't know...
Also, I found another problem while using RTMP input.
inputs:
  - name: "rtmp://localhost/live/6df2dc85-78f6-11ec-acbe-b07b2509af55"
    media_type: video
  # A second track (audio) from the same input file.
  - name: "rtmp://localhost/live/6df2dc85-78f6-11ec-acbe-b07b2509af55"
    media_type: audio
In this case the audio and video weren't syncing. I saw from the logs that ffmpeg was using two separate inputs instead of mapping two tracks from one input. That's why the audio and video weren't syncing, I guess...
ffmpeg -i input1 -i input2 ...
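What I expected instead was a single input with stream mapping. A sketch of that command, built in Python so the structure is explicit; the URL and output name are placeholders, and this is not what shaka-streamer actually generates:

```python
# Sketch: build one ffmpeg command that opens the RTMP stream once
# and maps its video and audio tracks with -map, rather than opening
# the same URL twice as two independent inputs.
url = "rtmp://localhost/live/stream"  # placeholder stream key

cmd = [
    "ffmpeg",
    "-i", url,        # single input, opened once
    "-map", "0:v:0",  # first video track of input 0
    "-map", "0:a:0",  # first audio track of input 0
    "-c", "copy",     # pass through without re-encoding
    "out.mp4",        # placeholder output
]

print(" ".join(cmd))
```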
So I ended up using my own scripts.
Hello, I cannot figure out how to configure utc_timings in my pipeline, and I keep getting an error.
Based on the error message it should be a list, and I tried that in my pipeline.yaml. I even tried to split the value by ":" and put the parts in a list. I cannot find an example of how to use utc_timings in a shaka-streamer config file anywhere on the internet...
Also, another side question: I set segment_size: 6 in my pipeline.yaml, but I saw 12-second segments in my stream.m3u8 files. I wrote a simple bash script using ffmpeg named pipes and shaka-packager, and the segment size was exactly what I intended. But shaka-streamer's segment size doesn't match my config file... why?
By the way, I use the Linux 64-bit binaries: shaka-streamer version 0.5.1, packager v2.6.1-634af65-release, ffmpeg n4.4.