pchpch11 / javacv

Automatically exported from code.google.com/p/javacv
GNU General Public License v2.0

FFmpegFrameRecorder has no support for audio #160

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Hi, Samuel,
    I am new to Android, and I want to use javacv/ffmpeg to encode camera data into .flv (h.264/aac). Can javacv/ffmpeg be used in this way?
    Thank you very much!
What steps will reproduce the problem?
1. I get data from the Android camera (in onPreviewFrame(byte[] data, Camera camera)).
2. I want to use JavaCV/FFmpeg to encode the data into flv (h.264/aac).
3. Can javacv/ffmpeg be used in this way?

What is the expected output? What do you see instead?
I just want to know whether javacv/ffmpeg can be used in this way. And it would be
better if there were a demo, ha.

What version of the product are you using? On what operating system?
Android 2.3, Lenovo S2005A

Please provide any additional information below.
I would very much appreciate it if you could help me. Thank you.

Original issue reported on code.google.com by zhangqia...@gmail.com on 27 Feb 2012 at 3:22

GoogleCodeExporter commented 9 years ago
I think there is something wrong with the libx264.so that I compiled. I am now
trying to recompile it; if I succeed, I will share my experience.

Original comment by zhangqia...@gmail.com on 17 Apr 2012 at 2:48

GoogleCodeExporter commented 9 years ago
Samuel, I compiled x264 again, but it still does not seem OK. Here are the
details of how I compiled and linked x264:

1. Download x264 from
ftp://ftp.videolan.org/pub/x264/snapshots/last_x264.tar.bz2, and untar it in
/home/xxx/
2. cd into x264/, and run:

./configure --prefix=.. \
--cross-prefix=arm-linux-androideabi- \
--enable-shared \
--enable-pic \
--host=arm-linux

and:
make

then:
make install

Then you will find libx264.so and libx264.so.122 in lib/, and x264.h and
x264_config.h in include/:

put libx264.so and libx264.so.122 into
android-ndk-r5/platforms/android-8/arch-arm/usr/lib
put x264.h and x264_config.h
into android-ndk-r5/platforms/android-8/arch-arm/usr/include

For the Android ffmpeg ./configure, add:

PREBUILT=/home/liang/android-ndk-r5/toolchains/arm-eabi-4.4.0/prebuilt/linux-x86
PLATFORM=/home/liang/android-ndk-r5/platforms/android-8/arch-arm

--enable-libx264
add "-I$PLATFORM/usr/include" to --extra-cflags
add "-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib" to --extra-ldflags

The next step is to compile ffmpeg; I believe it is the same as what others
have described online.

Original comment by zhangqia...@gmail.com on 17 Apr 2012 at 8:15

GoogleCodeExporter commented 9 years ago
What is not OK exactly? Your encoded FLV file contains visible video and
audio data, as one would expect...?

Original comment by samuel.a...@gmail.com on 17 Apr 2012 at 8:47

GoogleCodeExporter commented 9 years ago
The FLV file contains video and audio data; see the attached file. Maybe we
should set the preset of x264, or I will compile it again using your method.

Original comment by zhangqia...@gmail.com on 17 Apr 2012 at 9:01

GoogleCodeExporter commented 9 years ago
Hi, Samuel, you said you "built the included libraries by first applying the
included `ffmpeg-android-20120123.patch` to FFmpeg 0.7.11". I have never used
Fedora before; may I ask how to do it? Thanks.

Original comment by zhangqia...@gmail.com on 18 Apr 2012 at 6:39

GoogleCodeExporter commented 9 years ago
This is just for reference. It should work just fine on any Linux distribution.

So, what exactly is wrong with the FLV file you attached? I don't see any major 
problem, except that some frames are missing...

Original comment by samuel.a...@gmail.com on 18 Apr 2012 at 6:42

GoogleCodeExporter commented 9 years ago
Yes, that's the problem, and the audio seems a little fast. It confuses me,
and I don't know how to solve it.

Original comment by zhangqia...@gmail.com on 18 Apr 2012 at 6:47

GoogleCodeExporter commented 9 years ago
Is it possible that I missed some configure option during compiling?

Original comment by zhangqia...@gmail.com on 18 Apr 2012 at 6:51

GoogleCodeExporter commented 9 years ago
No, like I said, I think it's just your CPU that is too slow to encode in real 
time... ARM CPUs are (currently) much slower than x86 CPUs. I think you managed 
to compile everything just fine.

Original comment by samuel.a...@gmail.com on 18 Apr 2012 at 6:54

GoogleCodeExporter commented 9 years ago
Wow, this is a tough problem now. Could it be related to the preset of x264?
(Forgive me for asking a simple question; I am a rookie at Java, Android, and
streaming media.)

Original comment by zhangqia...@gmail.com on 18 Apr 2012 at 7:06

GoogleCodeExporter commented 9 years ago
Of course there are settings to encode faster, with lower quality, etc. There
are a lot of settings, so take the time to read x264's documentation, blogs,
etc.
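(For example, x264 provides speed presets from "ultrafast" to "placebo" and a
"zerolatency" tune; those are x264 options, not javacv-specific ones.)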

Original comment by samuel.a...@gmail.com on 18 Apr 2012 at 7:09

GoogleCodeExporter commented 9 years ago
Hi, Samuel,
I set:
video_pts = (double) video_st.pts().val() * video_st.time_base().num() /
video_st.time_base().den();
Log.d(LOG_TAG, "video_st.pts()" + video_st.pts());

and Logcat prints out "video_pts: -9.223372036854776E15", which stays the same
all the time. So, what should video_c.time_base be set to with CODEC_ID_H264?
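
For what it's worth, -9.223372036854776E15 looks like AV_NOPTS_VALUE (-2^63)
multiplied by a 1/1000 time base, i.e. the stream never received a valid pts.
A minimal guard, assuming the avutil wrapper exposes the AV_NOPTS_VALUE
constant:

    long pts = video_st.pts().val();
    if (pts == avutil.AV_NOPTS_VALUE) {
        // the stream has no valid pts yet
        Log.d(LOG_TAG, "no pts set");
    } else {
        double video_pts = (double) pts * video_st.time_base().num()
                / video_st.time_base().den();
        Log.d(LOG_TAG, "video_pts: " + video_pts);
    }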

Original comment by zhangqia...@gmail.com on 19 Apr 2012 at 2:03

GoogleCodeExporter commented 9 years ago
Hi, Samuel,

video_outbuf_size = imageWidth * imageHeight * 4; // ??

What is it based on?

Original comment by zhangqia...@gmail.com on 20 Apr 2012 at 2:30

GoogleCodeExporter commented 9 years ago
I have no idea what should be done to support x264 properly, so if you figure 
it out, please be sure to let me know!

As for video_outbuf_size, I just figured that a compressed frame would not get 
bigger than an uncompressed 8-bit RGBA image, the largest pixel format 
supported... What should it be?

Original comment by samuel.a...@gmail.com on 22 Apr 2012 at 2:30

GoogleCodeExporter commented 9 years ago
Now I can encode flv (h264/aac), but the audio is not quite right; see the
attached file. Maybe you can list your compilation method, and I will compare
it with mine. Then I can change some configure options or the Makefile to
match javacv. Do you think this could help?

Original comment by zhangqia...@gmail.com on 23 Apr 2012 at 2:32

Attachments:

GoogleCodeExporter commented 9 years ago
The configuration I use is all inside this package:
http://code.google.com/p/javacv/downloads/detail?name=ffmpeg-0.7.11-android-arm.zip

Original comment by samuel.a...@gmail.com on 23 Apr 2012 at 12:54

GoogleCodeExporter commented 9 years ago
Reopening as we have not yet properly handled H.264 and audio, either on 
Android or on any other platform...

Original comment by samuel.a...@gmail.com on 2 Jun 2012 at 5:00

GoogleCodeExporter commented 9 years ago
Issue 204 has been merged into this issue.

Original comment by samuel.a...@gmail.com on 2 Jun 2012 at 5:01

GoogleCodeExporter commented 9 years ago
Hello!
What have you done to be able to record h264 video? Do you have an example or 
something?
Thanks!

Original comment by Jkolo...@gmail.com on 2 Jun 2012 at 5:10

GoogleCodeExporter commented 9 years ago
This looks like a solution:
http://libav-users.943685.n4.nabble.com/libx264-xxx-non-strictly-monotonic-PTS-td3275701.html
But someone should try to implement it in javacv.

Original comment by Jkolo...@gmail.com on 2 Jun 2012 at 6:06

GoogleCodeExporter commented 9 years ago
Doesn't seem to work with H.263 either. This doesn't produce a playable file:
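        // (assuming the usual javacv static imports here:
        //  import static com.googlecode.javacv.cpp.opencv_highgui.*;
        //  import static com.googlecode.javacv.cpp.avcodec.*;
        //  import static com.googlecode.javacv.cpp.avutil.*;)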
        IplImage image = cvLoadImage("lena.jpg");
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("lena.3gp", 512, 512);
        recorder.setCodecID(CODEC_ID_H263);
        recorder.setFormat("3gp");
        recorder.setPixelFormat(PIX_FMT_YUV420P);
        recorder.start();
        for (int i = 0; i < 100; i++) {
            recorder.record(image);
        }
        recorder.stop();
If anyone knows the solution, please let us know.

Original comment by samuel.a...@gmail.com on 4 Jun 2012 at 12:34

GoogleCodeExporter commented 9 years ago
Hello, thanks for your great work, guys! I would like to know about merging
audio into the video. Has anyone made any progress?

Original comment by igor...@gmail.com on 6 Jun 2012 at 8:35

GoogleCodeExporter commented 9 years ago
It can work with H.264, but you should compile x264 into ffmpeg first, using
the method mentioned in Comment 52.

Original comment by zhangqia...@gmail.com on 11 Jun 2012 at 2:25

GoogleCodeExporter commented 9 years ago
Without compiling x264 into ffmpeg, it also works with H.263 (the FLV
variant); the CodecID should be CODEC_ID_FLV1.

Original comment by zhangqia...@gmail.com on 11 Jun 2012 at 2:27

GoogleCodeExporter commented 9 years ago
Hi guys,
   I'm trying to encode video from an image + audio. I've downloaded the "fujiaoji" sources (Comment 39) for audio support, but my application crashes. Below is the log after calling the start method:

06-11 11:25:16.465: I/Compress(7185): start begin
06-11 11:25:16.490: I/Compress(7185): video start finished
06-11 11:25:16.490: I/Compress(7185): audio start finished
06-11 11:25:16.510: I/Compress(7185): find video encoder
06-11 11:25:16.510: I/Compress(7185): encoder not find

Part of the code:
recorder.setCodecVideoID(avcodec.CODEC_ID_FLV1);
recorder.setCodecAudioID(avcodec.CODEC_ID_AAC);
recorder.setFormat("flv");
recorder.setPixelFormat(avutil.PIX_FMT_YUV420P);
recorder.setFrameRate(20);
recorder.start();
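
A quick way to confirm whether the AAC encoder is actually present in the
ffmpeg build (a hypothetical check using the avcodec wrapper, not code from
the fujiaoji sources):

AVCodec codec = avcodec.avcodec_find_encoder(avcodec.CODEC_ID_AAC);
if (codec == null) {
    // this ffmpeg build was compiled without an AAC encoder
}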

Thanks,
  Marcelo Alves

Original comment by malves.i...@gmail.com on 11 Jun 2012 at 2:45

GoogleCodeExporter commented 9 years ago
Hi Samuel, 

Regarding the "video_outbuf_size = imageWidth*imageHeight*4;" in
FFmpegFrameRecorder: I think there is actually a lower limit (~16400) for this
buffer size. If I try to encode a video with dimensions < 4100 pixel^2 (w x h),
it always complains about "buffer smaller than minimum size" (I am using the
mpeg4 codec).
If I tweak video_outbuf_size to imageWidth*imageHeight*10 (or *100), the
lower dimension limit no longer exists.
I suggest adding a lower limit for "video_outbuf_size", in case
imageWidth*imageHeight is too small and the allocated buffer does not reach
the minimum required for encoding.

Original comment by Qzts...@gmail.com on 11 Jun 2012 at 2:46

GoogleCodeExporter commented 9 years ago
I think the real problem happens in FFmpegFrameRecorder at line 321.
Error message: 06-11 15:04:55.180: A/libc(11826): Fatal signal 11 (SIGSEGV) at
0xfffffffd (code=1)

Original comment by malves.i...@gmail.com on 11 Jun 2012 at 6:11

GoogleCodeExporter commented 9 years ago
@Qztseng So, what is that lower limit? I can put *10, but that doesn't mean it
will always work either.

Original comment by samuel.a...@gmail.com on 12 Jun 2012 at 1:09

GoogleCodeExporter commented 9 years ago
@Samuel
I think the lower limit is around 16400. I currently modified it to check
whether width x height x 4 > 20000; if not, I use 20000 instead of w x h x 4.
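
In code, the tweak is just this (my local modification, not the official
source):

    video_outbuf_size = Math.max(imageWidth * imageHeight * 4, 20000);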

Original comment by Qzts...@gmail.com on 13 Jun 2012 at 10:47

GoogleCodeExporter commented 9 years ago
@Jkolobok I think I fixed H.264 encoding, please try the code from the
repository:
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FrameRecorder.java

@Qztseng Thanks, I'll be updating that too

Original comment by samuel.a...@gmail.com on 17 Jun 2012 at 3:28

GoogleCodeExporter commented 9 years ago
Oops, that would be:
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FFmpegFrameRecorder.java

Original comment by samuel.a...@gmail.com on 17 Jun 2012 at 3:29

GoogleCodeExporter commented 9 years ago
It looks like H.263 has been working all along, but apparently it only supports
a limited set of resolutions (128 x 96, 176 x 144, 352 x 288, 704 x 576, and
1408 x 1152), so I hardcoded that. I also included better defaults for the
common 3gp and mp4 formats, such that calls like these no longer fail by
default:
    new FFmpegFrameRecorder("file.3gp", width, height).start();
    new FFmpegFrameRecorder("file.mp4", width, height).start();

So, about audio, if anyone wants to contribute working audio support for 
FFmpegFrameRecorder, please post your patches here, thanks!

Original comment by samuel.a...@gmail.com on 17 Jun 2012 at 12:12

GoogleCodeExporter commented 9 years ago
Here is my code, encoding both video and audio:
VIDEO_CODEC_ID: CODEC_ID_FLV1;
AUDIO_CODEC_ID: CODEC_ID_AAC;

Original comment by zhangqia...@gmail.com on 20 Jun 2012 at 2:27

Attachments:

GoogleCodeExporter commented 9 years ago
Great, thanks! If it works, I'll put it in. BTW, have you fixed your issue with 
libx264? There's a couple of things I figured out were required by libx264 to 
work properly, so it should now work alright, even on Android.

Original comment by samuel.a...@gmail.com on 20 Jun 2012 at 2:35

GoogleCodeExporter commented 9 years ago
Yes, I did. But first, we should compile x264 into ffmpeg. Here is my code to
encode h.264/aac:

Original comment by zhangqia...@gmail.com on 20 Jun 2012 at 2:40

Attachments:

GoogleCodeExporter commented 9 years ago
Would you also be able to make a version that minimizes the changes to the 
source code, without adding anything that runs only on Android 
(android.util.Log at least), but also on Java SE? thank you

Original comment by samuel.a...@gmail.com on 20 Jun 2012 at 12:34

GoogleCodeExporter commented 9 years ago
Ok, I've added audio support to `FFmpegFrameRecorder` (`setAudioChannels(int)` 
for int > 0 and `record(Buffer)` alongside `record(IplImage)`)! Please try 
these new source files: 
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FrameRecorder.java
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FFmpegFrameRecorder.java
And let me know if you encounter any problems, thanks!

Original comment by samuel.a...@gmail.com on 25 Jun 2012 at 2:41

GoogleCodeExporter commented 9 years ago
Re @87

How do you test audio? Is there some sample code?
Should we call recorder.record(image) and recorder.record(buffer) separately?
How do you create a Buffer with audio data?

Original comment by marko.ko...@gmail.com on 25 Jun 2012 at 2:55

GoogleCodeExporter commented 9 years ago
Re @87
We should compare video_st to audio_st to decide when to write video data
and when to write audio data, to ensure the synchronization of video and audio.

Original comment by zhangqia...@gmail.com on 26 Jun 2012 at 1:34

GoogleCodeExporter commented 9 years ago
@89
it should be audio_pts and video_pts
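
More concretely, I mean the classic comparison from FFmpeg's muxing example
(output-example.c), sketched here in the javacv wrapper style:

    double audio_pts = (double) audio_st.pts().val()
            * audio_st.time_base().num() / audio_st.time_base().den();
    double video_pts = (double) video_st.pts().val()
            * video_st.time_base().num() / video_st.time_base().den();
    if (audio_pts < video_pts) {
        // write an audio frame next
    } else {
        // write a video frame next
    }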

Original comment by zhangqia...@gmail.com on 26 Jun 2012 at 1:35

GoogleCodeExporter commented 9 years ago
Marko, yes, that's what I said. More explicitly, something like:
    FrameRecorder recorder = new FFmpegFrameRecorder("filename.avi", width, height, audioChannels);
    recorder.start();
    // in a loop somewhere
    recorder.record(image);
    recorder.record(audioSamples);
As for the java.nio.Buffer class, it's all here:
    http://docs.oracle.com/javase/6/docs/api/java/nio/Buffer.html

Zhang, no we don't need to, av_interleaved_write_frame() takes care of that:
    http://ffmpeg.org/doxygen/0.6/avformat_8h.html#37352ed2c63493c38219d935e71db6c1
But obviously this only works if we actually send `frameRate` frames per 
second...
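
For example, a rough way to pace the video calls (just a sketch of mine, not
part of javacv; assume InterruptedException is handled by the caller):

    long frameIntervalNanos = (long) (1e9 / frameRate);
    long nextFrame = System.nanoTime();
    while (recording) {
        recorder.record(image); // one video frame per interval
        nextFrame += frameIntervalNanos;
        long sleepMillis = (nextFrame - System.nanoTime()) / 1000000;
        if (sleepMillis > 0) Thread.sleep(sleepMillis); // keep ~frameRate fps
    }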

Original comment by samuel.a...@gmail.com on 26 Jun 2012 at 2:43

GoogleCodeExporter commented 9 years ago
Ah, here's some sample code, copy/paste the example from this tutorial:
    http://docs.oracle.com/javase/tutorial/sound/converters.html
And replace 
    // Here, do something useful with the audio data that's 
    // now in the audioBytes array...
by
    recorder.record(someImage); // called according to the frameRate
    recorder.record(ByteBuffer.wrap(audioBytes, 0, numBytesRead));

Original comment by samuel.a...@gmail.com on 26 Jun 2012 at 10:17

GoogleCodeExporter commented 9 years ago
Ok, I've also added audio support to FFmpegFrameGrabber, so we can now use them 
together to test them out, or to use them as a poor man's transcoder, with 
something like this:

        FrameGrabber grabber = new FFmpegFrameGrabber("input.avi");
        grabber.start();
        FrameRecorder recorder = new FFmpegFrameRecorder("output.mp4", grabber.getImageWidth(), grabber.getImageHeight(), grabber.getAudioChannels());
        recorder.setFrameRate(grabber.getFrameRate());
        recorder.setSampleFormat(grabber.getSampleFormat());
        recorder.setSampleRate(grabber.getSampleRate());
        recorder.start();
        Frame frame;
        while ((frame = grabber.grabFrame()) != null) {
            recorder.record(frame);
        }
        recorder.stop();
        grabber.stop();

We need these updated classes to try it out:
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/Frame.java
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FrameGrabber.java
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FrameRecorder.java
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FFmpegFrameGrabber.java
http://javacv.googlecode.com/git/src/main/java/com/googlecode/javacv/FFmpegFrameRecorder.java

As usual, let me know of any issues! thanks

Original comment by samuel.a...@gmail.com on 28 Jun 2012 at 1:49

GoogleCodeExporter commented 9 years ago
Actually, I'm trying to test audio on Android, but for some reason I can't get
AudioRecord to initialize properly. Whatever combination of source, sample
rate, format, and buffer size I try, it fails to initialize, on both the
emulator and a phone. Has anyone tried this?

        int audioSource = MediaRecorder.AudioSource.DEFAULT;
        int[] sampleRates = {44100, 22050, 11025, 8000};
        // int sampleRateInHz = 44100;
        int channelConfig = AudioFormat.CHANNEL_IN_MONO;
        int audioFormat = AudioFormat.ENCODING_PCM_16BIT;

        int i = 0;
        do {
            int sampleRateInHz = sampleRates[i];
            bufferSizeInBytes = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat);
            if (bufferSizeInBytes < 1) bufferSizeInBytes = 65536;
            audioBuffer = new byte[bufferSizeInBytes];
            audioRecord = new AudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormat, bufferSizeInBytes);
        } while ((++i < sampleRates.length) && !(audioRecord.getState() == AudioRecord.STATE_INITIALIZED));
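
(Could it be a permissions issue? As far as I know, AudioRecord stays in
STATE_UNINITIALIZED unless the manifest declares
android.permission.RECORD_AUDIO.)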

Original comment by marko.ko...@gmail.com on 2 Jul 2012 at 8:46

GoogleCodeExporter commented 9 years ago
Here is my recording method, and it works well:

public class CameraView extends SurfaceView implements SurfaceHolder.Callback, 
PreviewCallback {

    private static final String LOG_TAG = "CameraView & AudioView";

    private SurfaceHolder mHolder;
    private Camera mCamera;
    private boolean isPreviewOn = false;

    /* video data */
    private int imageWidth;
    private int imageHeight;
    private int frameRate;
    private int sampleVideoBitRate;
    private IplImage yuvIplimage;

    /* audio data */
    private boolean isAudioRecording = false;
    private AudioRecord audioRecord;
    private Thread audioView;
    private int sampleAudioRateInHz;
    private int sampleAudioBitRate;

    /* recorder */
    private volatile boolean isRecorderStart = false;
    private FFmpegRecorder recorder;
    private String link;

    /**
     * set up the CameraView and the AudioThread, and initialize the recorder
     * @param context android.content.Context
     * @param camera hardware.Camera
     * @param width width of Camera's size 
     * @param height height of Camera's size 
     * @param frame frame rate of camera preview, and for ffmpeg encoding
     * @param videoRate video bit rate for ffmpeg encoding
     * @param audioRateInHz audio rate for sample and ffmpeg encoding
     * @param audioBitRate audio bit rate for sample and ffmpeg encoding
     * @param url ffmpeg server connecting link
     */
    public CameraView(Context context, Camera camera, int width, int height, int frame, int videoRate,
            int audioRateInHz, int audioBitRate, String url) {
        super(context);

        this.mCamera = camera;
        this.imageWidth = width;
        this.imageHeight = height;
        this.frameRate = frame;

        this.sampleVideoBitRate = videoRate;
        this.sampleAudioRateInHz = audioRateInHz;
        this.sampleAudioBitRate = audioBitRate;
        this.link = url;

        mHolder = getHolder();
        mHolder.addCallback(CameraView.this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mCamera.setPreviewCallback(CameraView.this);

        if (Register.isRegistered(context)) {

            recorder = new FFmpegRecorder(link, imageWidth, imageHeight, sampleVideoBitRate, 
                    frameRate, sampleAudioRateInHz, sampleAudioBitRate);

            audioView = new Thread(new AudioRecordThread());
            audioView.start();
        } else {
            System.exit(0);
        }

    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            stopPreview();
            mCamera.setPreviewDisplay(holder);
        } catch (IOException exception) {
            mCamera.release();
            mCamera = null;
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Camera.Parameters camParams = mCamera.getParameters();
        camParams.setPreviewSize(imageWidth, imageHeight);
        camParams.setPreviewFrameRate(frameRate);
        mCamera.setParameters(camParams);
        startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mHolder.addCallback(null);
        mCamera.setPreviewCallback(null);

        isAudioRecording = false;
    }

    public synchronized void startPreview() {
        if (!isPreviewOn && mCamera != null) {
            isPreviewOn = true;
            mCamera.startPreview();
        }
    }

    public synchronized void stopPreview() {
        if (isPreviewOn && mCamera != null) {
            isPreviewOn = false;
            mCamera.stopPreview();
        }
    }

    @Override
    public synchronized void onPreviewFrame(byte[] data, Camera camera) {
        try {
            if (yuvIplimage == null) {
                yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
                Log.i(LOG_TAG, "create yuvIplimage");
            }

            /* get video data, yuvIplimage put data */
            if (yuvIplimage != null && isRecorderStart) {
                yuvIplimage.getByteBuffer().put(data);
            }
        } catch (Exception e) { 
        }
    }

    /**
     * thread for getting audio data
     * @author Zhang Qianliang
     *
     */
    public class AudioRecordThread implements Runnable {
        @Override
        public void run() {
            int bufferLength = 0;
            int bufferSize;
            short[] audioData;
            int bufferReadResult;

            try {
                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);

                if (bufferSize <= 2048) {
                    bufferLength = 2048;
                } else if (bufferSize <= 4096) {
                    bufferLength = 4096;
                } else {
                    bufferLength = bufferSize; // fall back to the reported minimum size
                }

                /* set audio recorder parameters, and start recording */
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferLength);
                audioData = new short[bufferLength];
                audioRecord.startRecording();
                Log.d(LOG_TAG, "audioRecord.startRecording()");

                isAudioRecording = true;

                /* ffmpeg_audio encoding loop */
                while (isAudioRecording) {
                    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);

                    /* recorder.record(): bufferReadResult == 1024 */
                    if (bufferReadResult == 1024 && isRecorderStart) {
                        short[] realAudioData1024 = new short[bufferReadResult];
                        System.arraycopy(audioData, 0, realAudioData1024, 0, bufferReadResult);

                        recorder.recording(yuvIplimage, realAudioData1024);
                    } 

                    /* recorder.record(): bufferReadResult == 2048 */
                    else if (bufferReadResult == 2048 && isRecorderStart) {
                        short[] realAudioData2048_1 = new short[1024];
                        short[] realAudioData2048_2 = new short[1024];
                        System.arraycopy(audioData, 0, realAudioData2048_1, 0, 1024);
                        System.arraycopy(audioData, 1024, realAudioData2048_2, 0, 1024);

                        for (int i = 0; i < 2; i++) {
                            if (i == 0) {
                                recorder.recording(yuvIplimage, realAudioData2048_1);
                            } else if (i == 1) {
                                recorder.recording(yuvIplimage, realAudioData2048_2);
                            }
                        }
                    }
                }

                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    try {
                        audioRecord.stop();
                        audioRecord.release();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                    audioRecord = null;
                }

                if (recorder != null && isRecorderStart) {
                    try {
                        stop();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                    recorder = null;
                }
            } catch (Exception e) {
                Log.e(LOG_TAG, "get audio data failed");
            }
        }
    }

    /**
     * check whether the recorder has started or not
     * @return true--> started, false--> not started
     */
    public boolean isRecorderStart() {
        return isRecorderStart;
    }

    /**
     * to start or stop recording
     * @param status true --> to start, false --> to stop
     */
    public void setRecorderStatus(boolean status) {
        if (!this.isRecorderStart) {
            start();
        }
        this.isRecorderStart = status;

    }

    /**
     * start recorder
     */
    private void start() {
        recorder.start();
    }

    /**
     * stop recorder
     */
    private void stop() {
        recorder.stop();
        recorder.release();
    }
}

Original comment by zhangqia...@gmail.com on 3 Jul 2012 at 1:57

GoogleCodeExporter commented 9 years ago
In this example, where is FFmpegRecorder defined?

Original comment by mik...@gmail.com on 3 Jul 2012 at 6:19

GoogleCodeExporter commented 9 years ago
@96 

/* recorder */
    private volatile boolean isRecorderStart = false;
    private FFmpegRecorder recorder;
    private String link;

Original comment by yonyf...@gmail.com on 9 Jul 2012 at 3:54

GoogleCodeExporter commented 9 years ago
JavaCV 0.2 now comes with all the relevant changes to support H.264 and audio
with FFmpeg 0.11.1. Let me know whether it works well, thanks!

Original comment by samuel.a...@gmail.com on 22 Jul 2012 at 5:37

GoogleCodeExporter commented 9 years ago
I think he meant: where did you define that class? It is not part of javacv.
Can you post your FFmpegRecorder class, please? Or the full project for the
camera stuff would be good ;)

Original comment by florin.m...@gmail.com on 22 Jul 2012 at 7:04

GoogleCodeExporter commented 9 years ago
A demo to record a stream (CODEC_ID_FLV1/CODEC_ID_AAC):

Original comment by zhangqia...@gmail.com on 24 Jul 2012 at 9:15

Attachments: