Closed Type1J closed 9 months ago
It can! You need to construct a `Muxer` and pass the appropriate options preset. I don't have time to create an example myself, but if you get it to work, please contribute one here and I'll add it!
I wasn't able to get this working. For the time being, I'm just going to make complete `.mp4` files and handle that on the client side. I'd like to move to DASH.js at some point, which requires an `init.mp4` that's just the `ftyp` and `moov` boxes, plus `.m4s` files, each containing only a single `moof` and a single `mdat`, like the fragments a fragmented file has in series. It's going to be streaming live from a camera, which, in DASH.js, should be similar in setup to the live-stream-with-catch-up setup. If you find some time, this would be an invaluable example.
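Since DASH.js is picky about exactly which top-level boxes appear in the init and media segments, a quick sanity check is to list the box types in the files produced. Here is a minimal std-only sketch of that check (illustrative only; it assumes plain 32-bit box sizes and does not handle the `size == 0`/`size == 1` extended forms):

```rust
/// List the top-level box types in an MP4/fMP4 byte stream.
/// An `init.mp4` should show only `ftyp` and `moov`; each `.m4s`
/// segment should show a `moof` followed by an `mdat`.
fn top_level_boxes(data: &[u8]) -> Vec<String> {
    let mut boxes = Vec::new();
    let mut pos = 0usize;
    while pos + 8 <= data.len() {
        // Box header: 4-byte big-endian size, then a 4-byte type code.
        let size = u32::from_be_bytes([data[pos], data[pos + 1], data[pos + 2], data[pos + 3]])
            as usize;
        let kind = String::from_utf8_lossy(&data[pos + 4..pos + 8]).into_owned();
        boxes.push(kind);
        if size < 8 {
            break; // extended/zero sizes are out of scope for this sketch
        }
        pos += size;
    }
    boxes
}

fn main() {
    // A fake `.m4s`-style buffer: a bare `moof` box, then an `mdat` box
    // with 4 payload bytes.
    let mut buf = Vec::new();
    buf.extend_from_slice(&8u32.to_be_bytes());
    buf.extend_from_slice(b"moof");
    buf.extend_from_slice(&12u32.to_be_bytes());
    buf.extend_from_slice(b"mdat");
    buf.extend_from_slice(&[0, 0, 0, 0]);
    assert_eq!(top_level_boxes(&buf), ["moof", "mdat"]);
}
```

Running it on a real segment (read with `std::fs::read`) would show immediately whether an extra `moov` or missing `moof` is what DASH.js is choking on.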
I think it should be supported. Where are you getting stuck? I might be able to help out.
My starting point:
```rust
pub async fn generate(output_directory: &str, segment: usize) -> Result<()> {
    video_rs::init().expect("Could not init video.");
    create_dir_all(&output_directory).await?;

    // Create the `segment`-th segment file.
    let options = Options::new_with_fragmented_mov();
    let (width, height) = (1280, 720);
    let encoder_time_base = TIME_BASE;
    let frames_per_second = 12;
    let seconds_per_fragment = 10;
    let duration: Time = Time::from_nth_of_a_second(frames_per_second);
    let mut position = Time::zero();
    let frames_per_fragment = frames_per_second * seconds_per_fragment;
    let degrees_per_frame = 4;

    let settings = EncoderSettings::for_h264_yuv420p(width, height, true);
    let mut destination = PathBuf::from(output_directory);
    destination.push(&format!("video-{segment}.mp4"));
    let destination: Locator = destination.into();
    let mut encoder =
        Encoder::new_with_format_and_options(&destination, settings, "mp4", &options)?;

    let start = segment * frames_per_fragment;
    let end = start + frames_per_fragment;
    for i in start..end {
        // This will create a smooth rainbow animation video!
        let current = (i * degrees_per_frame) % 360;
        let mut frame = rainbow_frame(current as f32 / 360f32, width, height);
        frame.set_pts(aligned_with_rational(&position, encoder_time_base).into_value());
        encoder.encode_raw(frame)?;
        println!("{}", (current as f32 / 360f32) * 360.0);
        // Update the current position and add the inter-frame
        // duration to it.
        position = position.aligned_with(&duration).add();
    }
    encoder.finish()?;
    Ok(())
}
```
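As an aside, the PTS bookkeeping that `position` and `duration` perform in the loop above can be sanity-checked with plain integer ticks, independent of the library. This sketch assumes a 90,000 Hz encoder time base (a common MPEG default; the snippet above does not confirm what `TIME_BASE` actually is):

```rust
/// PTS of a frame in time-base ticks, assuming a constant frame rate.
/// With a 90_000 Hz base and 12 fps, each frame advances the PTS by
/// 90_000 / 12 = 7_500 ticks, so a 10-second fragment is 120 frames.
fn pts_for_frame(frame_index: u64, time_base_hz: u64, fps: u64) -> u64 {
    frame_index * time_base_hz / fps
}

fn main() {
    let (time_base_hz, fps) = (90_000, 12);
    assert_eq!(pts_for_frame(0, time_base_hz, fps), 0);
    assert_eq!(pts_for_frame(1, time_base_hz, fps), 7_500);
    // Frame 120 starts the next 10-second fragment: 10 s * 90_000 ticks/s.
    assert_eq!(pts_for_frame(120, time_base_hz, fps), 900_000);
}
```

If segment `N` is encoded starting from PTS 0 instead of `N * 900_000`, the player has to rely on the manifest for the timeline, which is one of the things that tends to differ between VOD and live setups.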
I haven't converted to the changes that you made just yet, so it still uses some `ffmpeg_next` exports, but I think my problem is that the `Options` from `let options = Options::new_with_fragmented_mov();` aren't observed unless I use a muxer. That's probably a good thing, because the `BufMuxer` is easy to segment (I think) when writing a fragmented file, but I'm not sure how to make the muxer take my input. I had a conversion from one file to another with something like:
```rust
let options = Options::new_with_fragmented_mov();
let mut muxer = BufMuxer::new_to_buf_with_options("mp4", options)?;
let mut muxer = muxer.with_streams(&reader)?; // `reader` declared somewhere...
FileMuxer::new_to_file(&PathBuf::from(output_file).into())?.with_streams(&reader)?;
while let Ok(packet) = reader.read(video_stream) {
    let buf = muxer.mux(packet)?;
    out.write(&buf).await?;
}
muxer.finish()?;
```
I thought that loop could maybe be used to create the segment files out of the fragments if I switched to a `BufMuxer`, but I didn't test it, and I wasn't sure how to send frames to the muxer instead of using something like `.with_streams(&reader)` above. I thought about making some sort of `Reader` from the `Encoder`, but I'm not sure how; I'd have to dig around in the code to figure it out, and I wasn't sure my line of thought was even correct.
I'd like one function that generates the `init.mp4`, and one that generates a 10-second `.m4s` segment file, the way the code above generates a 10-second complete `.mp4` file. I'm not sure how to get there.
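One route that side-steps the muxer API question entirely: if the fragmented output can be obtained as one byte stream (for example the complete fragmented `.mp4` the code above writes), the `init.mp4`/`.m4s` split is mostly a matter of cutting at top-level box boundaries. A rough std-only sketch of that idea, not tied to `video-rs` (32-bit box sizes only; a real implementation would also deal with `styp`/`sidx` boxes and extended sizes):

```rust
/// Split a fragmented-MP4 byte stream into segments, cutting at each
/// `moof` box. Everything before the first `moof` (`ftyp` + `moov`)
/// becomes the init segment; each later chunk is a `moof`+`mdat` pair.
fn split_at_moof(data: &[u8]) -> Vec<Vec<u8>> {
    let mut segments: Vec<Vec<u8>> = Vec::new();
    let mut pos = 0usize;
    while pos + 8 <= data.len() {
        let size = u32::from_be_bytes([data[pos], data[pos + 1], data[pos + 2], data[pos + 3]])
            as usize;
        if size < 8 {
            break; // extended/zero box sizes are out of scope here
        }
        let kind = &data[pos + 4..pos + 8];
        // A `moof` starts a new segment; the very first box starts the
        // init segment.
        if kind == b"moof" || segments.is_empty() {
            segments.push(Vec::new());
        }
        let end = (pos + size).min(data.len());
        segments.last_mut().unwrap().extend_from_slice(&data[pos..end]);
        pos = end;
    }
    segments
}

fn main() {
    // Fake stream: ftyp + moov, then one fragment (moof + mdat).
    let mut buf = Vec::new();
    for (size, kind) in [(8u32, *b"ftyp"), (8, *b"moov"), (8, *b"moof"), (12, *b"mdat")] {
        buf.extend_from_slice(&size.to_be_bytes());
        buf.extend_from_slice(&kind);
        buf.resize(buf.len() + size as usize - 8, 0); // payload padding
    }
    let segs = split_at_moof(&buf);
    assert_eq!(segs.len(), 2);
    assert_eq!(segs[0].len(), 16); // init: ftyp + moov
    assert_eq!(segs[1].len(), 20); // media: moof + mdat
}
```

The first element would be written as `init.mp4` and each subsequent element as `video-x.m4s`; whether that is cleaner than driving the muxer per segment is a design choice I can't judge without trying both.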
I hope I explained that well. Let me know if anything is unclear.
I should probably add that the `Result` type is from `anyhow`, so that I could use `?`. The `video_rs::init()` call has a `Result` that didn't convert, so I used `.expect()` as the other examples did. I also used `tokio` for file I/O. If you send back a working example, I can understand any file I/O or error handling method that you want to use. I'm just interested in how to use the library to make files that I can play with DASH. Also, no audio is needed.
Thanks in advance for your help!
Instead of `with_streams` you might be able to use `with_stream` and then just insert a single stream with ID = 0, i.e. `StreamInfo { index: 0 }`, then send packets into it.
Anyway, the API still assumes you have `Packet`s (which are basically encoded frames). This crate does not have an API for encoding packets to memory (since that would complicate the API). Depending on exactly what you are trying to do, you could encode your content and then read it with the `Reader` to get packets.
> I'd like a full example of opening a `.mp4` file and outputting `-x.m4s` files where `x` is the segment number of that file.

In summary, this should be quite easy! Just create a reader for the file and follow the instructions.
I'm probably going to come back here and try using just this library once I have everything working. I'm trying to get the basic video flow of my project going first.
There are a few places in this library that seem to suggest that it can easily create a fragmented/segmented MP4. I'd like a full example of opening a `.mp4` file and outputting `-x.m4s` files, where `x` is the segment number of that file.