Open FunnyDevs opened 2 years ago
Hi! I'm using the Android APIs to record audio and video and then mux them, but I don't want to use the Android MediaMuxer API to mux the incoming ByteBuffers. Could I use FFmpegFrameRecorder to mux this data into an MP4 file?
Sure, it may not be that straightforward, but I'm sure there's a way to do it with the FFmpeg API.
/cc @bradh
I think I should "packetize" every ByteBuffer as a Frame object, but I don't know if this is the right way or if I need another approach.
What exactly is in those ByteBuffers? You'll need to understand that before someone can help you.
These ByteBuffers contain encoded data. To explain better: I'm trying to modify this library's code so that it muxes the audio and video data not with the Android muxer but with another muxer, such as FFmpegFrameRecorder.
The muxing of the encoded data happens in this method: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/4b85bce8475deb97380dad3bbe51aca6d24087ea/rtplibrary/src/main/java/com/pedro/rtplibrary/util/RecordController.java#L155
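One direction I'm thinking of (just a rough sketch, not the library's actual code: writeVideoSample and pipedVideoOutputStream are placeholder names, and it assumes the encoder emits a stream FFmpeg can probe, e.g. Annex-B H.264 with the codec-config buffers written first) is to forward each encoded ByteBuffer into a PipedOutputStream whose PipedInputStream end is read by an FFmpegFrameGrabber, instead of handing the buffer to MediaMuxer:

// Hypothetical sketch: copy the encoded bytes into a pipe that an
// FFmpegFrameGrabber reads from, instead of calling mediaMuxer.writeSampleData().
void writeVideoSample(ByteBuffer byteBuffer, MediaCodec.BufferInfo info,
                      PipedOutputStream pipedVideoOutputStream) throws IOException {
    byte[] data = new byte[info.size];   // copy only the valid region of the buffer
    byteBuffer.position(info.offset);
    byteBuffer.limit(info.offset + info.size);
    byteBuffer.get(data);
    pipedVideoOutputStream.write(data);  // the grabber sees this on its InputStream end
}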
OK, I'm now trying with a combination of FFmpegFrameGrabber and FFmpegFrameRecorder (video data only for now). When I use the grabber I notice two problems:
1) I have a surface preview where the camera preview is displayed. When I start the grab/record loop, the UI freezes after 5 seconds (for 3-4 seconds) and then it lags; in the resulting video the first 5 seconds are fine but the remaining seconds look accelerated.
2) I would like to simulate "-vcodec copy", but if I don't call
recorder.setVideoCodec(AV_CODEC_ID_H264);
the recorded video doesn't contain H.264 data (so the quality is bad). How can I copy the video stream without re-encoding it through setVideoCodec()?
Thank you
grabPacket() with start(context) and recordPacket() can be used for copy operations like that.
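Roughly along these lines (error handling omitted; FFmpegFrameGrabber, FFmpegFrameRecorder and AVPacket are the org.bytedeco classes, and the file names are placeholders):

// Remux without re-encoding, the equivalent of "-c copy".
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("input.mp4");
grabber.start();

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("output.mp4", 0, 0);
// start(AVFormatContext) copies the input stream parameters to the output,
// so packets can be written as-is afterwards.
recorder.start(grabber.getFormatContext());

AVPacket packet;
while ((packet = grabber.grabPacket()) != null) {
    recorder.recordPacket(packet);
}

recorder.stop();
recorder.close();
grabber.stop();
grabber.close();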
OK, with grabPacket() and recordPacket() it works for the video data, thank you.
Does it also work for AAC-encoded data? I added a second FFmpegFrameGrabber to grab the audio AVPackets and record them with the same recorder, but I get this error:
av_write_frame() error -22 while writing video packet
This is my code
pipedVideoInputStream = new PipedInputStream();
pipedVideoOutputStream = new PipedOutputStream((PipedInputStream) pipedVideoInputStream);
pipedAudioInputStream = new PipedInputStream();
pipedAudioOutputStream = new PipedOutputStream((PipedInputStream) pipedAudioInputStream);

// One grabber per pipe: encoded video and audio bytes are written into the
// output ends, and the grabbers demux them from the input ends.
videoGrabber = new FFmpegFrameGrabber(pipedVideoInputStream, 0);
audioGrabber = new FFmpegFrameGrabber(pipedAudioInputStream, 0);

file.createNewFile();

// Recorder parameters are taken from the video grabber.
recorder.setSampleRate(videoGrabber.getSampleRate());
recorder.setFrameRate(videoGrabber.getFrameRate());
recorder.setVideoBitrate(videoGrabber.getVideoBitrate());
recorder.setVideoCodec(videoGrabber.getVideoCodec());
recorder.setAudioChannels(videoGrabber.getAudioChannels());

new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            videoGrabber.start();
            audioGrabber.start();
            // The recorder is started with the video grabber's format context only.
            recorder.start(videoGrabber.getFormatContext());

            long dts = 0;
            long pts = 0;
            int timebase = 0;
            double framerate = videoGrabber.getFrameRate();
            long lasttime = System.currentTimeMillis();
            int pktindex = 0;
            int err_index = -1;
            int exitcode = 0;
            AVPacket packet;

            while (record) {
                // Copy the next video packet, rewriting its timestamps.
                packet = videoGrabber.grabPacket();
                if (packet == null)
                    continue;
                if (packet.stream_index() == 1) {
                    av_packet_unref(packet);
                    continue;
                }
                packet.pts(pts);
                packet.dts(dts);
                timebase = videoGrabber.getFormatContext().streams(packet.stream_index()).time_base().den();
                pts += timebase / (int) framerate;
                dts += timebase / (int) framerate;
                recorder.recordPacket(packet);

                // Copy the next audio packet from the second grabber into the
                // same recorder; this is where av_write_frame() returns -22.
                packet = audioGrabber.grabPacket();
                recorder.recordPacket(packet);
            }
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}).start();
I don't think it supports more than a single context, no; that sounds like an enhancement. If you're willing to help, please do!
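A possible workaround in the meantime (only a sketch: it assumes you can get both elementary streams into one input that FFmpeg can demux, for example a single piped stream in a container like MPEG-TS, instead of two separate pipes; muxedInputStream is a placeholder) is to use a single grabber, so the recorder's one format context already knows both streams:

// Hypothetical: "muxedInputStream" carries both audio and video in one
// container, so a single grabber (and a single format context) is enough.
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(muxedInputStream, 0);
grabber.start();
recorder.start(grabber.getFormatContext());

AVPacket packet;
while (record && (packet = grabber.grabPacket()) != null) {
    // Both stream indices exist in the recorder's context, so audio and
    // video packets can be copied without the -22 error.
    recorder.recordPacket(packet);
}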