angelcam / rust-ac-ffmpeg

Simple and safe Rust interface for FFmpeg libraries.

How to achieve transcoding, such as converting H264 video streams to FLVs #76

Open · redoriental opened 1 year ago

operutka commented 1 year ago

So, is there any question, or is your question in the title?

operutka commented 1 year ago

In any case, FLV is just a container (similar to MP4), so there would be no transcoding. The FLV container would still contain the H.264 video stream (unless you wanted to transcode your video into H.265, for example).

There are usage examples where you can see how to use this library: https://github.com/angelcam/rust-ac-ffmpeg/tree/master/ac-ffmpeg/examples
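
For the remux case (keeping the H.264 stream and only changing the container), the flow is: demux, then push the packets into a muxer unchanged. Below is a minimal sketch based on the demuxing/muxing examples; the function name `remux_to_flv` and the use of plain files for I/O are placeholders, not part of the library:

use std::fs::File;

use ac_ffmpeg::format::demuxer::Demuxer;
use ac_ffmpeg::format::io::IO;
use ac_ffmpeg::format::muxer::{Muxer, OutputFormat};
use ac_ffmpeg::Error;

/// Copy every elementary stream from `input` into an FLV written to `output`
/// (stream copy, no re-encoding). Placeholder helper, not part of the crate.
fn remux_to_flv(input: File, output: File) -> Result<(), Error> {
    let mut demuxer = Demuxer::builder()
        .build(IO::from_read_stream(input))?
        .find_stream_info(None)
        .map_err(|(_, err)| err)?;

    let mut muxer_builder = Muxer::builder();

    // Register the input streams with the muxer, keeping their codec parameters as-is.
    for stream in demuxer.streams() {
        muxer_builder.add_stream(&stream.codec_parameters())?;
    }

    let output_format = OutputFormat::find_by_name("flv")
        .ok_or_else(|| Error::new("unknown output format"))?;

    let mut muxer = muxer_builder.build(IO::from_write_stream(output), output_format)?;

    // Packets are passed through unchanged; stream indices are preserved by the demuxer.
    while let Some(packet) = demuxer.take()? {
        muxer.push(packet)?;
    }

    muxer.flush()
}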

redoriental commented 9 months ago

Thank you, my statement was indeed incorrect. How can I convert the audio from Opus to AAC?

operutka commented 9 months ago

You'll need to decode the Opus audio and encode it as AAC. Check the examples. There are sample apps for decoding and encoding. You can find inspiration there.
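
Roughly, that decode/encode loop could look like the sketch below. The function name, the target sample rate/format/layout and the use of plain files for I/O are placeholders; depending on the input you may also need to resample or re-frame the decoded audio before feeding the AAC encoder:

use std::fs::File;
use std::str::FromStr;

use ac_ffmpeg::codec::audio::{AudioDecoder, AudioEncoder, ChannelLayout, SampleFormat};
use ac_ffmpeg::codec::{Decoder, Encoder};
use ac_ffmpeg::format::demuxer::Demuxer;
use ac_ffmpeg::format::io::IO;
use ac_ffmpeg::format::muxer::{Muxer, OutputFormat};
use ac_ffmpeg::Error;

/// Transcode the first audio stream of `input` (e.g. Opus) into AAC and write it into an
/// FLV `output`. Other streams are ignored to keep the sketch short. Placeholder helper.
fn transcode_audio_to_aac(input: File, output: File) -> Result<(), Error> {
    let mut demuxer = Demuxer::builder()
        .build(IO::from_read_stream(input))?
        .find_stream_info(None)
        .map_err(|(_, err)| err)?;

    // Find the audio stream and build a decoder from its codec parameters.
    let (audio_index, stream) = demuxer
        .streams()
        .iter()
        .enumerate()
        .find(|(_, s)| s.codec_parameters().is_audio_codec())
        .ok_or_else(|| Error::new("no audio stream"))?;

    let mut decoder = AudioDecoder::from_stream(stream)?.build()?;

    // Target parameters are only illustrative; adjust them to your output requirements.
    let mut encoder = AudioEncoder::builder("aac")?
        .sample_format(SampleFormat::from_str("fltp").unwrap())
        .sample_rate(48_000)
        .channel_layout(ChannelLayout::from_str("stereo").unwrap())
        .bit_rate(128_000)
        .build()?;

    let mut muxer_builder = Muxer::builder();
    let out_index = muxer_builder.add_stream(&encoder.codec_parameters().into())?;

    let output_format = OutputFormat::find_by_name("flv")
        .ok_or_else(|| Error::new("unknown output format"))?;

    let mut muxer = muxer_builder.build(IO::from_write_stream(output), output_format)?;

    while let Some(packet) = demuxer.take()? {
        if packet.stream_index() != audio_index {
            continue;
        }

        decoder.push(packet)?;

        while let Some(frame) = decoder.take()? {
            // NOTE: if the decoded frame size/layout doesn't match what the AAC encoder
            // expects, the audio has to be resampled/re-framed here first.
            encoder.push(frame)?;

            while let Some(out) = encoder.take()? {
                muxer.push(out.with_stream_index(out_index))?;
            }
        }
    }

    muxer.flush()
}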

redoriental commented 9 months ago

fn convert(input: rw::ClosureReader, mut output: rw::ClosureWriter<fn(&[u8]) -> std::io::Result<usize>>) -> Result<(), Error> {
    println!("0");
    let mut demuxer = open_input(input)?;
    println!("1");
    let (stream_index, (stream, _)) = demuxer
        .streams()
        .iter()
        .map(|stream| (stream, stream.codec_parameters()))
        .enumerate()
        .find(|(_, (_, params))| params.is_video_codec())
        .ok_or_else(|| Error::new("no video stream"))?;
    match VideoDecoder::from_stream(stream) {
        Ok(video_decoder_build) => {
            match video_decoder_build.build() {
                Ok(mut video_decoder) => {
                    match VideoEncoder::builder("flv") {
                        Ok(video_encoder_build) => {
                            match video_encoder_build
                                .set_option("tune", "zerolatency")
                                .set_option("preset", "ultrafast")
                                .set_option("crf", "28")
                                .set_option("c:a", "acc")
                                .set_option("c:v","libx264")
                                .set_option("f","flv")
                                .height(480).width(640)
                                .bit_rate(2500000)
                                .pixel_format(PixelFormat::from_str("yuv420p").unwrap())
                                .build()
                            {
                                Ok(mut video_encoder) => {
                                    let codec_parameters = video_encoder.codec_parameters().into();
                                    let mut muxer = open_output(output, &[codec_parameters])?;
                                    while let Ok(Some(packet)) = demuxer.take() {
                                        video_decoder.push(packet);
                                        'one: loop {
                                            match video_decoder.take() {
                                                Ok(frame_o) => match frame_o {
                                                    None => break 'one,
                                                    Some(frame) => {
                                                        video_encoder.push(frame);
                                                        'two: loop {
                                                            match video_encoder.take() {
                                                                Ok(mux_packet_o) => match mux_packet_o {
                                                                    None => break 'two,
                                                                    Some(mux_packet) => {
                                                                        muxer.push(mux_packet);
                                                                    }
                                                                },
                                                                Err(_) => break 'two,
                                                            }
                                                        }
                                                    }
                                                },
                                                Err(e4) => break 'one,
                                            }
                                        }
                                    }
                                }
                                Err(e3) => eprintln!("4{}", e3),
                            }
                        }
                        Err(e2) => eprintln!("3{}", e2),
                    }
                }
                Err(e1) => eprintln!("2{}", e1),
            }
        }
        Err(e) => eprintln!("1{}", e),
    }
    Ok(())
}

I encountered an error when recording the webm video stream:

[h264 @ 0000024E882286C0] No start code is found.
[h264 @ 0000024E882286C0] Error splitting the input into NAL units.
[h264 @ 0000024E882286C0] No start code is found.
[h264 @ 0000024E882286C0] Error splitting the input into NAL units.

operutka commented 9 months ago

This is completely wrong:

match VideoEncoder::builder("flv") {
    Ok(video_encoder_build) => {
        match video_encoder_build
            .set_option("tune", "zerolatency")
            .set_option("preset", "ultrafast")
            .set_option("crf", "28")
            .set_option("c:a", "acc")
            .set_option("c:v","libx264")
            .set_option("f","flv")
            .height(480).width(640)
            .bit_rate(2500000)
            .pixel_format(PixelFormat::from_str("yuv420p").unwrap())
            .build()
        {
            ...
        }
    }
    ...
}

You cannot use the VideoEncoder struct the same way you're using the ffmpeg app. This crate is not the ffmpeg app and it isn't a wrapper around the app either. It is a wrapper around the FFmpeg libraries - a lower-level interface used by the ffmpeg app.

You were able to build the video encoder mostly by accident, because flv also happens to be the name of a video codec (it's more or less an alias for H.263), and the options that aren't valid in this context are simply ignored. But clearly you aren't trying to transcode your video into H.263. Based on the parameters, I assume that you're trying to put an H.264 video into an FLV container.

I strongly suggest making yourself familiar with the concept of media codecs, containers and the difference between them. You should also be familiar with the FFmpeg API documentation because this crate is only a wrapper around the FFmpeg libraries. And, please, go through the code examples in this repository carefully.

Once you've done that, you'll understand that:

  1. You'll need to demux your container first.
  2. Then you'll need to process the selected streams in the container individually. You can either: a. copy the packets as they are into your muxer, or b. transcode the packets into a different audio/video codec (a sketch of this path follows below the list).
  3. Finally, you'll need to mux the individual streams into the output container.
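
For the transcoding path in 2b, the video encoder is configured through the builder methods only. Here is a hedged sketch; the helper name and the hard-coded resolution and options are placeholders, and you will probably also want to set a time base as in the encoding example:

use std::str::FromStr;

use ac_ffmpeg::codec::video::{PixelFormat, VideoEncoder};
use ac_ffmpeg::Error;

/// Build an H.264 encoder for step 2b (placeholder name and parameters).
/// Only builder methods are used; ffmpeg-CLI options such as "c:v" or "f"
/// have no meaning in this API.
fn build_h264_encoder() -> Result<VideoEncoder, Error> {
    VideoEncoder::builder("libx264")?
        .pixel_format(PixelFormat::from_str("yuv420p").unwrap())
        .width(640)
        .height(480)
        .set_option("preset", "ultrafast")
        .set_option("tune", "zerolatency")
        .build()
}

Decoded frames are then pushed into this encoder and the resulting packets into the muxer, using the same push/take pattern as in the audio sketch above.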

redoriental commented 9 months ago

After reading your answer, I see my mistake. You mean that FLV is not an encoding format, and that I need to decode and transcode the video and audio streams separately. But how can I merge the two streams into the FLV container afterwards? Thank you for your answer.

redoriental commented 9 months ago

When running the code VideoEncoder::builder("libx264"), an unknown codec error was reported.

redoriental commented 9 months ago

mod rw;

use std::fs::File;
use std::io::{Read, Write};
use std::str::FromStr;
use std::sync::{Arc, Mutex};
use std::thread;

use websocket::sync::{Client, Server};
use websocket::OwnedMessage;

use ac_ffmpeg::{
    codec::CodecParameters,
    format::{
        demuxer::{Demuxer, DemuxerWithStreamInfo},
        io::IO,
        muxer::{Muxer, OutputFormat},
    },
    Error,
};
use ac_ffmpeg::codec::{AudioCodecParameters, Decoder, Encoder, VideoCodecParameters};
use ac_ffmpeg::codec::audio::{AudioDecoder, AudioDecoderBuilder, AudioEncoder, AudioEncoderBuilder, ChannelLayout, SampleFormat};
use ac_ffmpeg::codec::video::{PixelFormat, VideoDecoder, VideoDecoderBuilder, VideoEncoder, VideoEncoderBuilder, VideoFrame};
use ac_ffmpeg::packet::Packet;
use futures::AsyncWriteExt;
use websocket::ws::dataframe::DataFrame;
use crate::rw::{ClosureReader, ClosureWriter};

/// Open a given input file.
fn open_input(write_func: rw::ClosureReader) -> Result<DemuxerWithStreamInfo<rw::ClosureReader>, Error> {
    let io = IO::from_read_stream(write_func);

    Demuxer::builder()
        .build(io)?
        .find_stream_info(None)
        .map_err(|(_, err)| err)
}

/// Open a given output file.
fn open_output(out: rw::ClosureWriter, elementary_streams: &[CodecParameters]) -> (Result<Muxer<rw::ClosureWriter>, Error>, [usize; 2]) {
    let output_format = OutputFormat::find_by_name("flv").unwrap();
    let io = IO::from_write_stream(out);
    let mut ids = [3, 3];
    let mut muxer_builder = Muxer::builder();

    for codec_parameters in elementary_streams {
        let id = muxer_builder.add_stream(&codec_parameters.clone()).unwrap();
        if codec_parameters.is_video_codec() {
            ids[0] = id;
        } else {
            ids[1] = id;
        }
    }

    (muxer_builder.build(io, output_format), ids)
}

/// Convert a given input file into a given output file.
async fn convert(input: rw::ClosureReader, mut output: rw::ClosureWriter) -> Result<(), Error> {

let mut demuxer = open_input(input)?;
let stream_arr = demuxer.streams();
let mut audio_decoder:Option<AudioDecoder> = None;
let mut audio_encoder:Option<AudioEncoder> = None;
let mut video_decoder:Option<VideoDecoder> = None;
let mut video_encoder:Option<VideoEncoder> = None;
let mut has_video = false;
let mut has_audio = false;
let mut video_index = 3;
let mut audio_index = 3;
let mut codec_params = Vec::<CodecParameters>::new();
for stream in stream_arr {
    if stream.codec_parameters().is_audio_codec() {
        match AudioDecoder::from_stream(stream) {
            Ok(audio_decoder_build) => {
                println!("{:?}", stream.codec_parameters().encoder_name());
                match audio_decoder_build.build() {
                    Ok(audio_decoder_) => {
                        audio_decoder = Option::from(audio_decoder_);
                        has_audio = true;
                    }
                    Err(e) => { println!("audio build?? {}", e); }
                }
            }
            Err(e) => { println!("audio??{}", e); }
        }
    }else if stream.codec_parameters().is_video_codec(){
        match VideoDecoder::from_stream(stream) {
            Ok(video_decoder_build) => {
                match video_decoder_build.build() {
                    Ok(video_decoder_) => {
                        video_decoder = Option::from(video_decoder_);
                        has_video = true;
                    }
                    Err(e) => { println!("video build?? {}", e); }
                }
            }
            Err(e) => { println!("video?? {}", e); }
        }
    }
}
if has_video {
    match VideoEncoder::builder("libx264") {//There is an error here
        Ok(video_encoder_build) => {
            match video_encoder_build
                .set_option("tune", "zerolatency")
                .set_option("preset", "ultrafast")
                .set_option("crf", "28")
                .set_option("c:v","libx264")
                .pixel_format(PixelFormat::from_str("yuv420p").unwrap())
                .bit_rate(2500000)
                .width(1408).height(1152)
                .build() {
                Ok(video_encoder_) => {
                    codec_params.push(video_encoder_.codec_parameters().into());
                    video_encoder = Option::from(video_encoder_);
                }
                Err(e) => { println!("video_encoder_build??{}", e); }
            }
        }
        Err(e) => { println!("video_encoder??{}", e); }
    }
}

if has_audio {
    match AudioEncoder::builder("aac") {
        Ok(audio_encoder_build) => {
            match audio_encoder_build
                .sample_format(SampleFormat::from_str("fltp").unwrap())
                .sample_rate(44100)
                .channel_layout(ChannelLayout::from_str("2.1").unwrap())
                .bit_rate(128*1000)
                .build() {
                Ok(audio_encoder_) => {
                    codec_params.push(audio_encoder_.codec_parameters().into());
                    audio_encoder = Option::from(audio_encoder_);
                }
                Err(e) => { println!("audio_encoder_build?? {}", e); }
            }
        }
        Err(e) => { println!("audio_encoder??{}", e); }
    }
}

let (mut muxer,ids) = open_output(output, codec_params.as_slice());
let mut muxer = muxer.unwrap();
video_index = ids[0];
audio_index = ids[1];
let mut demuxer = demuxer.into_demuxer();
let (mut send, mut recv) = tokio::sync::mpsc::unbounded_channel::<Packet>();
let j = tokio::spawn(async move{
    let mut video_decoder = video_decoder.unwrap();
    let mut audio_decoder = audio_decoder.unwrap();
    let mut video_encoder = video_encoder.unwrap();
    let mut audio_encoder = audio_encoder.unwrap();

    while let Some(packet) = recv.recv().await {
        video_decoder.push(packet.clone());
        audio_decoder.push(packet.clone());
        while let Ok(Some(frame))=video_decoder.take() {
            video_encoder.push(frame);
        }
        while let Ok(Some(frame))=audio_decoder.take(){
            audio_encoder.push(frame);
        }
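        // NOTE: this loop takes one packet from each encoder per iteration and stops as
        // soon as either of them returns None, so a packet already taken from the other
        // encoder in that final iteration is dropped.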
        while let (Ok(Some(audio_packet)),Ok(Some(video_packet))) = (audio_encoder.take(),video_encoder.take()) {
            muxer.push(video_packet.with_stream_index(video_index));
            muxer.push(audio_packet.with_stream_index(audio_index));
        }

    }
});

while let Ok(Some(packet)) = demuxer.take() {
    send.send(packet);
}
j.await;
Ok(())

}

#[tokio::main]

async fn main() {

let (send,read) = std::sync::mpsc::channel::<Vec<u8>>();
thread::spawn(move||{
    let mut file = File::create("E://a.mp4").unwrap();
    let server = Server::bind("127.0.0.1:10000").unwrap();
    for request in server.filter_map(Result::ok) {
        match request.accept() {
            Ok(tcps) => {
                println!("websocket success");
                let (mut receiver, mut sender) = tcps.split().unwrap();
                for message in receiver.incoming_messages() {
                    let message = message.unwrap();
                    match message {
                        OwnedMessage::Close(_) => {
                            let message = OwnedMessage::Close(None);
                            sender.send_message(&message).unwrap();
                            file.flush();
                            return;
                        }
                        OwnedMessage::Ping(ping) => {
                            let message = OwnedMessage::Pong(ping);
                            sender.send_message(&message).unwrap();
                        }
                        _ => {
                            let mut payload = message.take_payload();
                            // let mut i = 0;
                            // file.write_all(payload.as_slice());
                            let i = 0;
                            if payload.len()<4096 {
                                send.send(payload);
                            }else {
                                let result: Vec<Vec<u8>> = payload.chunks(4096).map(|chunk| chunk.to_vec()).collect();
                                for x in result {
                                    send.send(x);
                                }
                            }
                        },
                    }
                }
            }
            Err(_) => {}
        }
    }
});
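// Reader closure: blocks on the channel and copies one received chunk into `buf`
// (this assumes `buf` is always at least as large as the chunk).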
let input_stream = move |buf: &mut [u8]| -> std::io::Result<usize> {
    let vc = read.recv().unwrap();
    for i in 0..vc.len() {
        buf[i] = vc[i];
    }
    Ok(vc.len())
};
let mut reader:ClosureReader = ClosureReader::new(input_stream);
let output_stream = move |buf: &[u8]| -> std::io::Result<usize> {
    println!("out {:?}", &buf);
    Ok(buf.len())
};
let mut writer:ClosureWriter = ClosureWriter::new(output_stream);

if let Err(err) = convert(reader,writer).await {
    eprintln!("ERROR: {}", err);
}

}