361809948 / javacv

Automatically exported from code.google.com/p/javacv
GNU General Public License v2.0

FFmpegFrameRecorder has no support for audio #160

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Hi Samuel,
    I am new to Android, and I want to use javacv/ffmpeg to encode camera data into .flv (h.264/aac). Can javacv/ffmpeg be used in this way?
    Thank you very much!
What steps will reproduce the problem?
1. I have got data from the Android camera (in onPreviewFrame(byte[] data, Camera camera)).
2. I want to use JavaCV/FFmpeg to encode the data into flv (h.264/aac).
3. Can javacv/ffmpeg be used in this way?

What is the expected output? What do you see instead?
I just want to know whether javacv/ffmpeg can be used in this way. It would be 
even better if there were a demo.

What version of the product are you using? On what operating system?
Android 2.3, Lenovo S2005A

Please provide any additional information below.
I would very much appreciate it if you could help me. Thank you.

Original issue reported on code.google.com by zhangqia...@gmail.com on 27 Feb 2012 at 3:22

GoogleCodeExporter commented 9 years ago
and the permissions:
    <uses-sdk android:minSdkVersion="8" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS" />
    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />

Original comment by zhangqia...@gmail.com on 24 Jul 2012 at 9:16

GoogleCodeExporter commented 9 years ago
Hi Zhang,

I have an audio file and a video file separately. In this case, how do I merge 
the two? Please advise.

Original comment by itsrajes...@gmail.com on 24 Jul 2012 at 1:40
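For reference, a rough sketch of this kind of file merge against the current `org.bytedeco.javacv` API (the `com.googlecode` packages used elsewhere in this thread are older and differ slightly); the file names here are placeholders, not paths from this thread:

```java
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class MergeAudioVideo {
    public static void main(String[] args) throws Exception {
        // Placeholder input/output paths
        FFmpegFrameGrabber video = new FFmpegFrameGrabber("video.mp4");
        FFmpegFrameGrabber audio = new FFmpegFrameGrabber("audio.wav");
        video.start();
        audio.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("merged.flv",
                video.getImageWidth(), video.getImageHeight(), audio.getAudioChannels());
        recorder.setFormat("flv");
        recorder.setFrameRate(video.getFrameRate());
        recorder.setSampleRate(audio.getSampleRate());
        recorder.start();

        // Rough approach: copy all video frames, then all audio frames.
        // A more careful merge would interleave the two streams and carry
        // timestamps over with recorder.setTimestamp(grabber.getTimestamp()).
        Frame frame;
        while ((frame = video.grabFrame()) != null) {
            recorder.record(frame);
        }
        while ((frame = audio.grabFrame()) != null) {
            recorder.record(frame);
        }

        recorder.stop();
        recorder.release();
        audio.stop();
        video.stop();
    }
}
```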

GoogleCodeExporter commented 9 years ago
On the 0.02 release, your code is not working. On the 0.01 release, when I try to 
publish to an RTMP server, it doesn't work either (the server says invalid format).
I've attempted to make your code work with the 0.02 version (record to memory first), 
and now I get:
07-27 16:40:25.063: W/System.err(6349): com.googlecode.javacv.FrameRecorder$Exception: Could not open audio codec
Is AAC not included in the latest ffmpeg published in the download section?
Zhang, can you publish the camera successfully to a Wowza server?

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 12:47
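On the "Could not open audio codec" error: whether an AAC encoder exists in the bundled ffmpeg can be probed directly before opening the recorder. A sketch against the `com.googlecode.javacv.cpp.avcodec` bindings this thread uses (class name is mine; API names follow the 0.2-era wrappers):

```java
import static com.googlecode.javacv.cpp.avcodec.*;

public class AacCheck {
    public static void main(String[] args) {
        // Register all codecs, then ask libavcodec for an AAC encoder.
        // avcodec_find_encoder() returns null when the codec was not
        // compiled into the ffmpeg build that javacv loads.
        avcodec_register_all();
        AVCodec aac = avcodec_find_encoder(CODEC_ID_AAC);
        System.out.println(aac == null
                ? "No AAC encoder in this ffmpeg build"
                : "AAC encoder found");
    }
}
```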

GoogleCodeExporter commented 9 years ago
package com.camera_test;

import static com.googlecode.javacv.cpp.avcodec.*;
import static com.googlecode.javacv.cpp.avutil.*;
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_core.IplImage;
import com.googlecode.javacv.FFmpegFrameRecorder;
import static com.googlecode.javacv.FrameRecorder.Exception;

import java.io.File;
import java.io.IOException;

import java.nio.ByteBuffer;

import android.app.Activity;
import android.content.Context;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.RelativeLayout;
import android.widget.Toast;

public class RecordActivity extends Activity {
    private final static String LOG_TAG = "RecordActivity";
    private PowerManager.WakeLock mWakeLock;

    private boolean isAudioRecording = false;

    private File videoFile;
    private String fileStr = "/mnt/sdcard/stream2.flv";
    private String ffmpeg_link;
    public volatile boolean isRecorderStart = false;
    public volatile boolean isVideoStart = false;

    private volatile FFmpegFrameRecorder recorder;

    /* the parameter of ffmpeg setting */
    private final static int sampleAudioRateInHz = 11025;
    private final static int sampleAudioBitRate = 32000;
    private final static int imageWidth = 176;
    private final static int imageHeight = 144;
    private final static int frameRate = 8;
    private final static int sampleVideoBitRate = 200000;

    /* audio data getting thread */
    private AudioRecord audioRecord;
    private Thread audioView;

    /* video data getting thread */
    private Camera cameraDevice;
    private CameraView cameraView;
    private boolean isPreviewOn = false;
    private IplImage yuvIplimage = null;

    /* layout setting */
    private final int bg_screen_bx = 232;
    private final int bg_screen_by = 128;
    private final int bg_screen_width = 700;
    private final int bg_screen_height = 500;
    private final int bg_width = 1123;
    private final int bg_height = 715;
    private final int live_width = 640;
    private final int live_height = 480;
    private int screenWidth, screenHeight;
    private Button btnRecorderControl;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.record_layout);
        Log.w("camera","PM stuff");
        /* manager - keep phone waking up */
        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
        mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST"); 
        mWakeLock.acquire(); 
        Log.w("camera","PM stuff done");

        initLayout();

        Log.w("camera","after init layout");
        videoFile=new File(fileStr);
        if (!videoFile.exists()) {
            try {   
                videoFile.createNewFile();
                Log.i(LOG_TAG, "create videoFile success");
            } catch (IOException e) {
                Log.e(LOG_TAG, "create videoFile failed");
            }
        } else {
            Log.i(LOG_TAG, "videoFile exited already");
        }

        ffmpeg_link = videoFile.getAbsolutePath();
        //ffmpeg_link="rtmp://88.208.200.115:1935/live/15/memberCam";

        initRecorder();
    }

    @Override
    protected void onResume() {
        super.onResume();

        if (mWakeLock == null) {
           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST");
           mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();

        isAudioRecording = false;

        if (cameraView != null) {
            cameraView.stopPreview();       
            cameraDevice.release();
            cameraDevice = null;
        }

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
        isRecorderStart = false;
    }

    //---------------------------------------
    // initialize layout   
    //---------------------------------------
    private void initLayout() {

        /* get size of screen */
        Log.w("camera","1");
        Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
        Log.w("camera","2");
        screenWidth = display.getWidth();
        screenHeight = display.getHeight();
        Log.w("camera","4");
        RelativeLayout.LayoutParams layoutParam = null; 
        Log.w("camera","5");
        LayoutInflater myInflate = null; 
        Log.w("camera","6");
        myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
        Log.w("camera","7");
        RelativeLayout topLayout = new RelativeLayout(this);
        Log.w("camera","8");
        setContentView(topLayout);
        Log.w("camera","8");
        RelativeLayout preViewLayout = (RelativeLayout) myInflate.inflate(R.layout.record_layout, null);
        Log.w("camera","9");
        layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
        Log.w("camera","10");
        topLayout.addView(preViewLayout, layoutParam);
        Log.w("camera","11");

        /* add control button: start and stop */
        btnRecorderControl = (Button) findViewById(R.id.recorder_control);
        Log.w("camera","12");
        btnRecorderControl.setOnClickListener(mControlAction);
        Log.w("camera","13");

        /* add camera view */
        int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
        int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
        int prev_rw, prev_rh;
        if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
            prev_rh = display_height_d;
            prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
        } else {
            prev_rw = display_width_d;
            prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
        }
        layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
        layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
        layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

        cameraDevice = Camera.open();
        Log.i(LOG_TAG, "camera open");
        //Log.w(LOG_TAG,"many cam:"+String.valueOf(cameraDevice.getNumberOfCameras()));
        cameraView = new CameraView(this, cameraDevice);
        topLayout.addView(cameraView, layoutParam);
        Log.i(LOG_TAG, "camera preview start: OK");

    }

    //---------------------------------------
    // recorder control button 
    //---------------------------------------
    private OnClickListener mControlAction = new OnClickListener() {
        @Override
        public void onClick(View v) {

            if (!isRecorderStart) {
                try {
                    recorder.start();
                } catch (Exception e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
                isRecorderStart = true;
                Log.w("camera","btn stop");
                btnRecorderControl.setBackgroundResource(R.drawable.btn_record_stop);
            } else if (isRecorderStart) {
                isRecorderStart = false;
                Log.w("camera","btn start");
                btnRecorderControl.setBackgroundResource(R.drawable.btn_record_start);
            }
        }
    };

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {

        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (isRecorderStart) {
                Toast.makeText(RecordActivity.this, R.string.before_finish, Toast.LENGTH_SHORT).show(); // third argument is a duration flag, not milliseconds
            } else {
                RecordActivity.this.finish();
            }
            return true;
        }

        return super.onKeyDown(keyCode, event);
    }

    //---------------------------------------
    // initialize ffmpeg_recorder
    //---------------------------------------
    private void initRecorder() {

        //--------------------------------------
        // get intent data: ffmpeg link
        //--------------------------------------
        Log.w("camera","init recorder");
        Bundle bundle = new Bundle();
        bundle = this.getIntent().getExtras();

        if (bundle != null) {
            ffmpeg_link = bundle.getString("link_url");
        }

        if (yuvIplimage == null) {
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            Log.i(LOG_TAG, "create yuvIplimage");
        }

        Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight,1);
//      AVCodec codec;

        recorder.setAudioCodec(CODEC_ID_AAC);

        //recorder.setAudioSampleRate(sampleAudioRateInHz);
        recorder.setAudioBitrate(sampleAudioBitRate);
        recorder.setAudioChannels(1);
        recorder.setVideoCodec(CODEC_ID_FLV1);
        //recorder.setCodecVideoID(CODEC_ID_FLV1);
        recorder.setFrameRate(frameRate);
        recorder.setVideoBitrate(sampleVideoBitRate);

        recorder.setPixelFormat(PIX_FMT_YUV420P);
        recorder.setFormat("flv");

        Log.i(LOG_TAG, "recorder initialize success");

        audioView = new Thread(new AudioRecordThread());
        audioView.start();
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordThread implements Runnable {
        @Override
        public void run() {
            int bufferLength = 0;
            int bufferSize;
            byte[] audioData;
            int bufferReadResult;

            try {
                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);

                if (bufferSize <= 2048) {
                    bufferLength = 2048;
                } else if (bufferSize <= 4096) {
                    bufferLength = 4096;
                } else {
                    bufferLength = bufferSize; // never leave 0: a larger minimum would make the AudioRecord constructor fail
                }

                /* set audio recorder parameters, and start recording */
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferLength);
                audioData = new byte[bufferLength];
                audioRecord.startRecording();
                Log.d(LOG_TAG, "audioRecord.startRecording()");

                isAudioRecording = true;

                /* ffmpeg_audio encoding loop */
                while (isAudioRecording) {
                    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);

                    if (bufferReadResult == 1024 && isRecorderStart) {
                        byte[] realAudioData1024 = new byte[bufferReadResult];
                        System.arraycopy(audioData, 0, realAudioData1024, 0, bufferReadResult);

                        recorder.record(yuvIplimage);
                        recorder.record(ByteBuffer.wrap(realAudioData1024, 0, 1024) );
                        //recorder.record(realAudioData1024);
                        Log.d(LOG_TAG, "recorder.record(): bufferReadResult == 1024");

                    } else if (bufferReadResult == 2048 && isRecorderStart) {
                        byte[] realAudioData2048_1 = new byte[1024];
                        byte[] realAudioData2048_2 = new byte[1024];

                        System.arraycopy(audioData, 0, realAudioData2048_1, 0, 1024);
                        System.arraycopy(audioData, 1024, realAudioData2048_2, 0, 1024);

                        for (int i = 0; i < 2; i++) {
                            if (i == 0) {
                                recorder.record(yuvIplimage);//, 
                                recorder.record(ByteBuffer.wrap(realAudioData2048_1, 0, 1024) );

                            } else if (i == 1) {
                                recorder.record(yuvIplimage);//, 
                                recorder.record(ByteBuffer.wrap(realAudioData2048_2, 0, 1024) );
                            }
                        }
                        Log.d(LOG_TAG, "recorder.record(): bufferReadResult == 2048");
                    }
                }

                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                }

                if (recorder != null && isRecorderStart) {
                    try {
                        recorder.stop();
                        recorder.release();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                    recorder = null;
                }
            } catch (Exception e) {
                Log.e(LOG_TAG, "get audio data failed");
            }

        }
    }

    //---------------------------------------------
    // camera thread, gets and encodes video data
    //---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraView(Context context, Camera camera) {
            super(context);
            Log.w("camera","camera view");
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(CameraView.this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            mCamera.setPreviewCallback(CameraView.this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                stopPreview();
                mCamera.setPreviewDisplay(holder);
            } catch (IOException exception) {
                mCamera.release();
                mCamera = null;
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Camera.Parameters camParams = mCamera.getParameters();
            camParams.setPreviewSize(imageWidth, imageHeight);
            camParams.setPreviewFrameRate(frameRate);
            mCamera.setParameters(camParams);
            startPreview();
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            mHolder.removeCallback(this); // addCallback(null) would not unregister this view
            mCamera.setPreviewCallback(null);
        }

        public synchronized void startPreview() {
            if (!isPreviewOn && mCamera != null) {
                isPreviewOn = true;
                mCamera.startPreview();
            }
        }

        public synchronized void stopPreview() {
            if (isPreviewOn && mCamera != null) {
                isPreviewOn = false;
                mCamera.stopPreview();
            }
        }

        @Override
        public synchronized void onPreviewFrame(byte[] data, Camera camera) {
            /* get video data */
            if (yuvIplimage != null && isRecorderStart) {
                yuvIplimage.getByteBuffer().put(data);
                Log.i(LOG_TAG, "yuvIplimage put data");

            }
            Log.i(LOG_TAG, "onPreviewFrame - wrote bytes: " + data.length + "; " + 
                    camera.getParameters().getPreviewSize().width +" x " + 
                    camera.getParameters().getPreviewSize().height + "; frameRate: " +
                    camera.getParameters().getPreviewFrameRate());
        }
    }
}

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 12:48
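The audio loop in AudioRecordThread above feeds the encoder fixed 1024-byte chunks, splitting a larger AudioRecord read with System.arraycopy. The same splitting logic as a standalone sketch (class and method names are my own, not from the code above):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkSplit {
    static final int CHUNK = 1024;

    // Split a read of `length` bytes into CHUNK-sized slices;
    // any trailing partial chunk is dropped, as in the original loop.
    static List<byte[]> split(byte[] data, int length) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off + CHUNK <= length; off += CHUNK) {
            byte[] c = new byte[CHUNK];
            System.arraycopy(data, off, c, 0, CHUNK);
            chunks.add(c);
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] read = new byte[2048];
        read[0] = 7;
        read[1024] = 9;
        List<byte[]> chunks = split(read, read.length);
        System.out.println(chunks.size());    // 2
        System.out.println(chunks.get(0)[0]); // 7
        System.out.println(chunks.get(1)[0]); // 9
    }
}
```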

GoogleCodeExporter commented 9 years ago
@florin This is what you need:

* Enhanced `FFmpegFrameRecorder` to support converting between audio sample formats, for the experimental AAC encoder among other things:
http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65ae041e74a9

Original comment by samuel.a...@gmail.com on 27 Jul 2012 at 2:54

GoogleCodeExporter commented 9 years ago
Can you please compile it (for Android 2.2 ARM) and upload it? For some reason I 
haven't managed to recompile the library at all :( I always get errors :( (using 
Mac OS X)

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 6:05

GoogleCodeExporter commented 9 years ago
No need to recompile everything, just those few files

Original comment by samuel.a...@gmail.com on 28 Jul 2012 at 1:23

GoogleCodeExporter commented 9 years ago
I've tried, but my Eclipse doesn't like this:
"  @Override 
    public Frame grabFrame() throws Exception {
        return grabFrame(true, true);
    }
"
giving the error:
    The return type is incompatible with FrameGrabber.grabFrame()
even though the other overload is:
    private Frame grabFrame(boolean processImage, boolean doAudio) throws Exception {

Original comment by florin.m...@gmail.com on 28 Jul 2012 at 6:27

GoogleCodeExporter commented 9 years ago
So recompile FrameGrabber as well

Original comment by samuel.a...@gmail.com on 28 Jul 2012 at 6:58

GoogleCodeExporter commented 9 years ago
No luck :( 
FrameGrabber.java doesn't like:
import java.beans.PropertyEditorSupport;

and 
 public static class PropertyEditor extends PropertyEditorSupport {
        @Override public String getAsText() {
            Class c = (Class)getValue();
            return c == null ? "null" : c.getSimpleName().split("FrameGrabber")[0];
        }
        @Override public void setAsText(String s) {
            if (s == null) {
                setValue(null);
            }
            try {
                setValue(get(s));
            } catch (Exception ex) {
                throw new IllegalArgumentException(ex);
            }
        }
        @Override public String[] getTags() {
            return list.toArray(new String[list.size()]);
        }
    }

(PropertyEditorSupport cannot be resolved to a type)

So I'm not having much luck :(
Can you please compile it?

Original comment by florin.m...@gmail.com on 28 Jul 2012 at 11:11

GoogleCodeExporter commented 9 years ago
Then you don't have Java SE 6 installed. Even if I compile it, it's not going 
to work on your machine. Try again with Java SE 6...

Original comment by samuel.a...@gmail.com on 29 Jul 2012 at 2:09

GoogleCodeExporter commented 9 years ago
And I've just updated the `pom.xml` file so that we can recompile more easily 
just the Java source files. Try this command:
    mvn package -Pall -Djavacpp.skip=true

Original comment by samuel.a...@gmail.com on 29 Jul 2012 at 9:13

GoogleCodeExporter commented 9 years ago
I've compiled the 3 Java files from this revision 
(http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65ae041e74a9) 
with my project, together with the Activity posted by zhangqia...@gmail.com on 
Jul 24, 2012, and I'm getting a lot of noise in the sound, and the streaming 
sometimes stops.

Original comment by jorslb.i...@gmail.com on 20 Aug 2012 at 3:04

GoogleCodeExporter commented 9 years ago
Hey, I want to compile javacv using mvn package (and mvn package -Pall 
-Djavacpp.skip=true also fails with an error). The error log is attached. I 
installed OpenCV 2.4.2 with MacPorts (port install opencv) and compiled javacpp. 
(Where should javacpp.jar be copied to? The target folder?)
Original comment by sh...@163.com on 20 Aug 2012 at 3:27

Attachments:

GoogleCodeExporter commented 9 years ago
Please try to install from the latest source of both JavaCPP *and* JavaCV 
repositories... I suppose I should add a SNAPSHOT qualifier to the version in 
the source repository?

Original comment by samuel.a...@gmail.com on 21 Aug 2012 at 6:18

GoogleCodeExporter commented 9 years ago
If I have version 0.2, do I need to add the 3 Java files from this revision 
(http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65ae041e74a9) 
to get audio support?
I have tried with a lot of devices and always get a lot of noise and really 
strange sounds.

Original comment by jorslb.i...@gmail.com on 21 Aug 2012 at 8:32