hqq128 / javacv

Automatically exported from code.google.com/p/javacv
GNU General Public License v2.0

FFmpegFrameRecorder has no support for audio #160

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
hi Samuel,
    I am new to Android, and I want to use javacv/ffmpeg to encode camera data into .flv (H.264/AAC). Can javacv/ffmpeg be used this way?
    Thank you very much!
What steps will reproduce the problem?
1. I have got data from the Android camera (in onPreviewFrame(byte[] data, Camera camera)).
2. I want to use JavaCV/FFmpeg to encode that data into FLV (H.264/AAC).
3. Can javacv/ffmpeg be used this way?

What is the expected output? What do you see instead?
I just want to know whether javacv/ffmpeg can be used this way. It would be even better if there were a demo.

What version of the product are you using? On what operating system?
Android 2.3, Lenovo S2005A

Please provide any additional information below.
I would very much appreciate your help, thank you.

Original issue reported on code.google.com by zhangqia...@gmail.com on 27 Feb 2012 at 3:22

GoogleCodeExporter commented 9 years ago
and the permissions:
    <uses-sdk android:minSdkVersion="8" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS" />
    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />

Original comment by zhangqia...@gmail.com on 24 Jul 2012 at 9:16

GoogleCodeExporter commented 9 years ago
Hi Zhang,

I have an audio file and a video file separately. In this case, how do I merge the two? Please advise.
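Outside of javacv, an already-recorded audio file and video file are usually merged with FFmpeg's stream copy, which muxes without re-encoding. This is only a sketch; the file names are placeholders, not anything from this thread:

```shell
# Mux an existing video file and audio file into one FLV without re-encoding.
# video.flv / audio.aac / out.flv are placeholder names.
ffmpeg -i video.flv -i audio.aac -vcodec copy -acodec copy out.flv
```

If the container does not accept one of the copied codecs, re-encode that stream instead of using copy.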

Original comment by itsrajes...@gmail.com on 24 Jul 2012 at 1:40

GoogleCodeExporter commented 9 years ago
On the 0.02 release, your code is not working. On the 0.01 release, when I try to publish to an RTMP server, it does not work either (the server says invalid format).
I've attempted to make your code work with the 0.02 version (record to memory first), and now I get:
07-27 16:40:25.063: W/System.err(6349): com.googlecode.javacv.FrameRecorder$Exception: Could not open audio codec
Is AAC not in the last FFmpeg build published in the download section?
Zhang, can you publish the camera stream successfully to a Wowza server?

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 12:47

GoogleCodeExporter commented 9 years ago
package com.camera_test;

import static com.googlecode.javacv.cpp.avcodec.*;
import static com.googlecode.javacv.cpp.avutil.*;
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_core.IplImage;
import com.googlecode.javacv.FFmpegFrameRecorder;
import static com.googlecode.javacv.FrameRecorder.Exception;

import java.io.File;
import java.io.IOException;

import java.nio.ByteBuffer;

import android.app.Activity;
import android.content.Context;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.RelativeLayout;
import android.widget.Toast;

public class RecordActivity extends Activity {
    private final static String LOG_TAG = "RecordActivity";
    private PowerManager.WakeLock mWakeLock;

    private boolean isAudioRecording = false;

    private File videoFile;
    private String fileStr = "/mnt/sdcard/stream2.flv";
    private String ffmpeg_link;
    public volatile boolean isRecorderStart = false;
    public volatile boolean isVideoStart = false;

    private volatile FFmpegFrameRecorder recorder;

    /* the parameter of ffmpeg setting */
    private final static int sampleAudioRateInHz = 11025;
    private final static int sampleAudioBitRate = 32000;
    private final static int imageWidth = 176;
    private final static int imageHeight = 144;
    private final static int frameRate = 8;
    private final static int sampleVideoBitRate = 200000;

    /* audio data getting thread */
    private AudioRecord audioRecord;
    private Thread audioView;

    /* video data getting thread */
    private Camera cameraDevice;
    private CameraView cameraView;
    private boolean isPreviewOn = false;
    private IplImage yuvIplimage = null;

    /* layout setting */
    private final int bg_screen_bx = 232;
    private final int bg_screen_by = 128;
    private final int bg_screen_width = 700;
    private final int bg_screen_height = 500;
    private final int bg_width = 1123;
    private final int bg_height = 715;
    private final int live_width = 640;
    private final int live_height = 480;
    private int screenWidth, screenHeight;
    private Button btnRecorderControl;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.record_layout);
        Log.w("camera","PM stuff");
        /* manager - keep phone waking up */
        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
        mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST"); 
        mWakeLock.acquire(); 
        Log.w("camera","PM stuff done");

        initLayout();

        Log.w("camera","after init layout");
        videoFile=new File(fileStr);
        if (!videoFile.exists()) {
            try {   
                videoFile.createNewFile();
                Log.i(LOG_TAG, "create videoFile success");
            } catch (IOException e) {
                Log.e(LOG_TAG, "create videoFile failed");
            }
        } else {
            Log.i(LOG_TAG, "videoFile exists already");
        }

        ffmpeg_link = videoFile.getAbsolutePath();
        //ffmpeg_link="rtmp://88.208.200.115:1935/live/15/memberCam";

        initRecorder();
    }

    @Override
    protected void onResume() {
        super.onResume();

        if (mWakeLock == null) {
           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST");
           mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();

        isAudioRecording = false;

        if (cameraView != null) {
            cameraView.stopPreview();       
            cameraDevice.release();
            cameraDevice = null;
        }

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
        isRecorderStart = false;
    }

    //---------------------------------------
    // initialize layout   
    //---------------------------------------
    private void initLayout() {

        /* get size of screen */
        Log.w("camera","1");
        Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
        Log.w("camera","2");
        screenWidth = display.getWidth();
        screenHeight = display.getHeight();
        Log.w("camera","4");
        RelativeLayout.LayoutParams layoutParam = null; 
        Log.w("camera","5");
        LayoutInflater myInflate = null; 
        Log.w("camera","6");
        myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
        Log.w("camera","7");
        RelativeLayout topLayout = new RelativeLayout(this);
        Log.w("camera","8");
        setContentView(topLayout);
        Log.w("camera","8");
        RelativeLayout preViewLayout = (RelativeLayout) myInflate.inflate(R.layout.record_layout, null);
        Log.w("camera","9");
        layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
        Log.w("camera","10");
        topLayout.addView(preViewLayout, layoutParam);
        Log.w("camera","11");

        /* add control button: start and stop */
        btnRecorderControl = (Button) findViewById(R.id.recorder_control);
        Log.w("camera","12");
        btnRecorderControl.setOnClickListener(mControlAction);
        Log.w("camera","13");

        /* add camera view */
        int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
        int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
        int prev_rw, prev_rh;
        if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
            prev_rh = display_height_d;
            prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
        } else {
            prev_rw = display_width_d;
            prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
        }
        layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
        layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
        layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

        cameraDevice = Camera.open();
        Log.i(LOG_TAG, "camera open");
        //Log.w(LOG_TAG,"many cam:"+String.valueOf(cameraDevice.getNumberOfCameras()));
        cameraView = new CameraView(this, cameraDevice);
        topLayout.addView(cameraView, layoutParam);
        Log.i(LOG_TAG, "camera preview start: OK");

    }

    //---------------------------------------
    // recorder control button 
    //---------------------------------------
    private OnClickListener mControlAction = new OnClickListener() {
        @Override
        public void onClick(View v) {

            if (!isRecorderStart) {
                try {
                    recorder.start();
                } catch (Exception e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
                isRecorderStart = true;
                Log.w("camera","btn stop");
                btnRecorderControl.setBackgroundResource(R.drawable.btn_record_stop);
            } else if (isRecorderStart) {
                isRecorderStart = false;
                Log.w("camera","btn start");
                btnRecorderControl.setBackgroundResource(R.drawable.btn_record_start);
            }
        }
    };

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {

        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (isRecorderStart) {
                Toast.makeText(RecordActivity.this, R.string.before_finish, Toast.LENGTH_SHORT).show();
            } else {
                RecordActivity.this.finish();
            }
            return true;
        }

        return super.onKeyDown(keyCode, event);
    }

    //---------------------------------------
    // initialize ffmpeg_recorder
    //---------------------------------------
    private void initRecorder() {

        //--------------------------------------
        // get intent data: ffmpeg link
        //--------------------------------------
        Log.w("camera","init recorder");
        Bundle bundle = new Bundle();
        bundle = this.getIntent().getExtras();

        if (bundle != null) {
            ffmpeg_link = bundle.getString("link_url");
        }

        if (yuvIplimage == null) {
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            Log.i(LOG_TAG, "create yuvIplimage");
        }

        Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight,1);
//      AVCodec codec;

        recorder.setAudioCodec(CODEC_ID_AAC);

        //recorder.setAudioSampleRate(sampleAudioRateInHz);
        recorder.setAudioBitrate(sampleAudioBitRate);
        recorder.setAudioChannels(1);
        recorder.setVideoCodec(CODEC_ID_FLV1);
        //recorder.setCodecVideoID(CODEC_ID_FLV1);
        recorder.setFrameRate(frameRate);
        recorder.setVideoBitrate(sampleVideoBitRate);

        recorder.setPixelFormat(PIX_FMT_YUV420P);
        recorder.setFormat("flv");

        Log.i(LOG_TAG, "recorder initialize success");

        audioView = new Thread(new AudioRecordThread());
        audioView.start();
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordThread implements Runnable {
        @Override
        public void run() {
            int bufferLength = 0;
            int bufferSize;
            byte[] audioData;
            int bufferReadResult;

            try {
                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);

                if (bufferSize <= 2048) {
                    bufferLength = 2048;
                } else if (bufferSize <= 4096) {
                    bufferLength = 4096;
                } else {
                    // fall back to the device's reported minimum so the
                    // AudioRecord constructor is never given a zero-length buffer
                    bufferLength = bufferSize;
                }

                /* set audio recorder parameters, and start recording */
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferLength);
                audioData = new byte[bufferLength];
                audioRecord.startRecording();
                Log.d(LOG_TAG, "audioRecord.startRecording()");

                isAudioRecording = true;

                /* ffmpeg_audio encoding loop */
                while (isAudioRecording) {
                    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);

                    if (bufferReadResult == 1024 && isRecorderStart) {
                        byte[] realAudioData1024 = new byte[bufferReadResult];
                        System.arraycopy(audioData, 0, realAudioData1024, 0, bufferReadResult);

                        recorder.record(yuvIplimage);
                        recorder.record(ByteBuffer.wrap(realAudioData1024, 0, 1024) );
                        //recorder.record(realAudioData1024);
                        Log.d(LOG_TAG, "recorder.record(): bufferReadResult == 1024");

                    } else if (bufferReadResult == 2048 && isRecorderStart) {
                        byte[] realAudioData2048_1 = new byte[1024];
                        byte[] realAudioData2048_2 = new byte[1024];

                        System.arraycopy(audioData, 0, realAudioData2048_1, 0, 1024);
                        System.arraycopy(audioData, 1024, realAudioData2048_2, 0, 1024);

                        for (int i = 0; i < 2; i++) {
                            if (i == 0) {
                                recorder.record(yuvIplimage);//, 
                                recorder.record(ByteBuffer.wrap(realAudioData2048_1, 0, 1024) );

                            } else if (i == 1) {
                                recorder.record(yuvIplimage);//, 
                                recorder.record(ByteBuffer.wrap(realAudioData2048_2, 0, 1024) );
                            }
                        }
                        Log.d(LOG_TAG, "recorder.record(): bufferReadResult == 2048");
                    }
                }

                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                }

                if (recorder != null && isRecorderStart) {
                    try {
                        recorder.stop();
                        recorder.release();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                    recorder = null;
                }
            } catch (Exception e) {
                Log.e(LOG_TAG, "get audio data failed");
            }

        }
    }

    //---------------------------------------------
    // camera thread, gets and encodes video data
    //---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraView(Context context, Camera camera) {
            super(context);
            Log.w("camera","camera view");
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(CameraView.this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            mCamera.setPreviewCallback(CameraView.this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                stopPreview();
                mCamera.setPreviewDisplay(holder);
            } catch (IOException exception) {
                mCamera.release();
                mCamera = null;
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Camera.Parameters camParams = mCamera.getParameters();
            camParams.setPreviewSize(imageWidth, imageHeight);
            camParams.setPreviewFrameRate(frameRate);
            mCamera.setParameters(camParams);
            startPreview();
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            mHolder.removeCallback(this);
            mCamera.setPreviewCallback(null);
        }

        public synchronized void startPreview() {
            if (!isPreviewOn && mCamera != null) {
                isPreviewOn = true;
                mCamera.startPreview();
            }
        }

        public synchronized void stopPreview() {
            if (isPreviewOn && mCamera != null) {
                isPreviewOn = false;
                mCamera.stopPreview();
            }
        }

        @Override
        public synchronized void onPreviewFrame(byte[] data, Camera camera) {
            /* get video data */
            if (yuvIplimage != null && isRecorderStart) {
                yuvIplimage.getByteBuffer().put(data);
                Log.i(LOG_TAG, "yuvIplimage put data");

            }
            Log.i(LOG_TAG, "onPreviewFrame - wrote bytes: " + data.length + "; " + 
                    camera.getParameters().getPreviewSize().width +" x " + 
                    camera.getParameters().getPreviewSize().height + "; frameRate: " +
                    camera.getParameters().getPreviewFrameRate());
        }
    }
}
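The duplicated 1024/2048 branches in AudioRecordThread above can be generalized: split whatever AudioRecord.read() returns into fixed 1024-byte chunks and hand each one to recorder.record(ByteBuffer.wrap(chunk)). A minimal, javacv-free sketch of that splitting logic (the class and method names here are mine, not from the library):

```java
import java.util.ArrayList;
import java.util.List;

public class AudioChunker {
    // Splits the first 'length' bytes of 'buffer' into fixed-size chunks,
    // dropping any trailing partial chunk (as the branching code above
    // effectively does for reads that are not a multiple of 1024).
    public static List<byte[]> split(byte[] buffer, int length, int chunkSize) {
        List<byte[]> chunks = new ArrayList<byte[]>();
        for (int off = 0; off + chunkSize <= length; off += chunkSize) {
            byte[] chunk = new byte[chunkSize];
            System.arraycopy(buffer, off, chunk, 0, chunkSize);
            chunks.add(chunk);
        }
        return chunks;
    }
}
```

In the encoding loop this would replace both the `bufferReadResult == 1024` and `bufferReadResult == 2048` branches with one loop over the returned chunks.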

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 12:48

GoogleCodeExporter commented 9 years ago
@florin This is what you need:

* Enhanced `FFmpegFrameRecorder` to support converting between audio sample formats, for the experimental AAC encoder among other things:
http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65ae041e74a9

Original comment by samuel.a...@gmail.com on 27 Jul 2012 at 2:54

GoogleCodeExporter commented 9 years ago
Can you please compile it (for Android 2.2 ARM) and upload it? For some reason I have not managed to recompile the library at all :( I always get errors :( (using Mac OS X)

Original comment by florin.m...@gmail.com on 27 Jul 2012 at 6:05

GoogleCodeExporter commented 9 years ago
No need to recompile everything, just those few files

Original comment by samuel.a...@gmail.com on 28 Jul 2012 at 1:23

GoogleCodeExporter commented 9 years ago
I've tried, but my Eclipse doesn't like this:
"  @Override 
    public Frame grabFrame() throws Exception {
        return grabFrame(true, true);
    }
"
:
The return type is incompatible with FrameGrabber.grabFrame()
but 
    private Frame grabFrame(boolean processImage, boolean doAudio) throws Exception {

Original comment by florin.m...@gmail.com on 28 Jul 2012 at 6:27

GoogleCodeExporter commented 9 years ago
So recompile FrameGrabber as well

Original comment by samuel.a...@gmail.com on 28 Jul 2012 at 6:58

GoogleCodeExporter commented 9 years ago
no luck :( 
FrameGrabber.java doesn't like:
import java.beans.PropertyEditorSupport;

and 
 public static class PropertyEditor extends PropertyEditorSupport {
        @Override public String getAsText() {
            Class c = (Class)getValue();
            return c == null ? "null" : c.getSimpleName().split("FrameGrabber")[0];
        }
        @Override public void setAsText(String s) {
            if (s == null) {
                setValue(null);
            }
            try {
                setValue(get(s));
            } catch (Exception ex) {
                throw new IllegalArgumentException(ex);
            }
        }
        @Override public String[] getTags() {
            return list.toArray(new String[list.size()]);
        }
    }

(PropertyEditorSupport cannot be resolved to a type)

So I'm not having much luck :(
Can you please compile it?

Original comment by florin.m...@gmail.com on 28 Jul 2012 at 11:11

GoogleCodeExporter commented 9 years ago
Then you don't have Java SE 6 installed. Even if I compile it, it's not going 
to work on your machine. Try again with Java SE 6...

Original comment by samuel.a...@gmail.com on 29 Jul 2012 at 2:09

GoogleCodeExporter commented 9 years ago
And I've just updated the `pom.xml` file so that we can recompile more easily 
just the Java source files. Try this command:
    mvn package -Pall -Djavacpp.skip=true

Original comment by samuel.a...@gmail.com on 29 Jul 2012 at 9:13

GoogleCodeExporter commented 9 years ago
I've compiled this 
(http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65a
e041e74a9) 3 java files with my project and the Activity from 
zhangqia...@gmail.com on Jul 24, 2012 and I'm getting a lot of noise on my 
sound and the streaming sometimes stops.

Original comment by jorslb.i...@gmail.com on 20 Aug 2012 at 3:04

GoogleCodeExporter commented 9 years ago
Hey, I want to compile javacv using mvn package (and mvn package -Pall -Djavacpp.skip=true also fails). The error log is attached. I installed OpenCV 2.4.2 using MacPorts and compiled javacpp (where should javacpp.jar be copied to? The target folder?)

Original comment by sh...@163.com on 20 Aug 2012 at 3:27

Attachments:

GoogleCodeExporter commented 9 years ago
Please try to install from the latest source of both JavaCPP *and* JavaCV 
repositories... I suppose I should add a SNAPSHOT qualifier to the version in 
the source repository?

Original comment by samuel.a...@gmail.com on 21 Aug 2012 at 6:18

GoogleCodeExporter commented 9 years ago
If I have version 0.2, do I need to add the 3 Java files from http://code.google.com/p/javacv/source/detail?r=0f48fac8fa1840b848d07f8a02ce65ae041e74a9 to have audio support?
I have tried with a lot of devices and always get a lot of noise and really strange sounds.

Original comment by jorslb.i...@gmail.com on 21 Aug 2012 at 8:32

GoogleCodeExporter commented 9 years ago
My recommendation would be to try the latest code, yes.

Original comment by samuel.a...@gmail.com on 21 Aug 2012 at 8:34

GoogleCodeExporter commented 9 years ago
Hey Samuel, I downloaded the latest JavaCPP code and compiled it with the command 'mvn package'. Now I want to compile javacv, and I have downloaded the latest source code. The question is: where should javacpp.jar be put? And I installed OpenCV 2.4.2 with MacPorts; how do I configure pom.xml or other configuration files? Thanks!

Original comment by sh...@163.com on 21 Aug 2012 at 9:59

GoogleCodeExporter commented 9 years ago
I've compiled javacpp with success but can't compile javacv.
Using Windows, OpenCV 2.4.2, the Visual Studio 2010 Command Prompt, and the command "mvn package".
With the -e parameter it returns:
[INFO] Trace
org.apache.maven.lifecycle.LifecycleExecutionException: Command execution failed.
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:719)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:556)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:535)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:348)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
        at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
        at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
        at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
        at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution failed.
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:345)
        at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
        ... 17 more
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 2 (Exit value: 2)
        at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:346)
        at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:149)
        at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:589)
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:335)
        ... 19 more

Original comment by jorslb.i...@gmail.com on 21 Aug 2012 at 3:58

Attachments:

GoogleCodeExporter commented 9 years ago
On Linux it compiled with success, but some files are missing, like avcodec.class and avutil.class.

Original comment by jorslb.i...@gmail.com on 21 Aug 2012 at 5:29

GoogleCodeExporter commented 9 years ago
I tried to compile with "mvn package -Pall" and got the error "/home/jorge/javacv/src/main/java/com/googlecode/javacv/ImageTransformerCL.java:[23,24] package com.jogamp.opencl does not exist", even after adding the required JOCL and JOGL files under Libraries after opening the project in NetBeans.

I tried to remove all the problem files like ImageTransformerCL, JavaCVCL, etc., but then got the error "ARToolKitPlus/template.h: No such file or directory".

I tried to compile your ARToolKitPlus, but got the error "ARToolKitPlus/MemoryManager.h: No such file or directory".

Can you please distribute the compiled code, or explain how to compile javacv?
There are a lot of people who can't compile it.

Thanks
Jorge

Original comment by jorslb.i...@gmail.com on 22 Aug 2012 at 11:00

GoogleCodeExporter commented 9 years ago
Compiled with success with -Pall after extracting ARToolKitPlus into the javacv folder, but I'm getting:
(...)
08-22 14:25:26.767: W/dalvikvm(5168): No implementation found for native Lcom/googlecode/javacpp/Pointer;.allocate (Ljava/nio/Buffer;)
(...)
08-22 14:25:26.777: E/AndroidRuntime(5168):     at com.googlecode.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:654)
(...)

Original comment by jorslb.i...@gmail.com on 22 Aug 2012 at 11:30

GoogleCodeExporter commented 9 years ago
Don't call with the `-Pall` option, try to call `mvn package` or `mvn install` 
as mentioned in the README.txt file, which also states that JavaCPP is required.
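Pulling together the advice in this thread (JavaCPP first, then JavaCV, per README.txt), the build order looks roughly like this; the directory names are assumptions, not taken from the README:

```shell
# Build and install JavaCPP into the local Maven repository first,
# then build JavaCV against it (directory names are placeholders).
cd javacpp
mvn install
cd ../javacv
mvn install    # or: mvn package -Pffmpeg  (requires FFmpeg to be available)
```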

Original comment by samuel.a...@gmail.com on 22 Aug 2012 at 12:09

GoogleCodeExporter commented 9 years ago
And if you want to make a build contribution, simply attach it here; I'm sure it would make a lot of people happy :) thank you

Original comment by samuel.a...@gmail.com on 22 Aug 2012 at 12:10

GoogleCodeExporter commented 9 years ago
I compiled successfully using 'mvn install' in the javacpp and javacv folders. But when I replace version 0.2 with the compiled javacv.jar and javacpp.jar, it says: 'The import com.googlecode.javacv.cpp.avcodec cannot be resolved'. I'm using FFmpegFrameRecorder for Android. So, do I need to recompile FFmpeg?

Original comment by sh...@163.com on 22 Aug 2012 at 1:45

GoogleCodeExporter commented 9 years ago
I have the same problem here.
It excludes avcodec, avutil, etc. because of '<exclude.libav>**/cpp/av*.java</exclude.libav>' (and maybe other excludes), but without those excludes the project doesn't compile.

Original comment by jorslb.i...@gmail.com on 22 Aug 2012 at 2:21

GoogleCodeExporter commented 9 years ago
Or `mvn package -Pffmpeg`, and yes, you need FFmpeg to compile that, of course.

Original comment by samuel.a...@gmail.com on 22 Aug 2012 at 2:24

GoogleCodeExporter commented 9 years ago
Where should I put the source of FFmpeg 0.11? Or where should the FFmpeg library be installed to?

Original comment by sh...@163.com on 22 Aug 2012 at 3:03

GoogleCodeExporter commented 9 years ago
I've installed FFmpeg following "https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide", and running ffmpeg -version I get:
"
ffmpeg version 0.11.1.git
built on Aug 22 2012 15:57:54 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-version3 --enable-x11grab
libavutil      51. 70.100 / 51. 70.100
libavcodec     54. 54.100 / 54. 54.100
libavformat    54. 25.104 / 54. 25.104
libavdevice    54.  2.100 / 54.  2.100
libavfilter     3. 13.100 /  3. 13.100
libswscale      2.  1.101 /  2.  1.101
libswresample   0. 15.100 /  0. 15.100
libpostproc    52.  0.100 / 52.  0.100
"
but `mvn package -Pffmpeg` fails with the error:
In file included from /home/jorge/javacv/target/classes/com/googlecode/javacv/cpp/jniavcodec.cpp:47:0:
/usr/local/include/libavcodec/xvmc.h:30:33: fatal error: X11/extensions/XvMC.h: No such file or directory

Original comment by jorslb.i...@gmail.com on 22 Aug 2012 at 3:13

GoogleCodeExporter commented 9 years ago
Get FFmpeg from here, as stated in the README.txt file:
http://code.google.com/p/javacv/downloads/list

Original comment by samuel.a...@gmail.com on 23 Aug 2012 at 2:00

GoogleCodeExporter commented 9 years ago
Hey Samuel, I have read the README.txt several times but still haven't got the idea. Do you mean download ffmpeg 0.11.1, patch it with ffmpeg-android-20120716.patch, and then compile FFmpeg?
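The sequence being asked about would look something like the following sketch; the archive name and the patch strip level (-p1) are my assumptions, not details confirmed in this thread:

```shell
# Unpack the FFmpeg 0.11.1 sources and apply the Android patch
# before configuring and building (file names are assumptions).
tar xjf ffmpeg-0.11.1.tar.bz2
cd ffmpeg-0.11.1
patch -p1 < ../ffmpeg-android-20120716.patch
```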

Original comment by sh...@163.com on 25 Aug 2012 at 10:46

GoogleCodeExporter commented 9 years ago
I compiled successfully (downloaded the ffmpeg 0.11.1 source code and patched
it with 'ffmpeg-android-20120716.patch') on my Mac Pro. But after I copied
javacv.jar and javacpp.jar to my Android app's libs folder, I got this error
when using [recorder.record(ByteBuffer.wrap(realAudioData1024, 0, 1024));] to
record audio.

08-26 00:06:10.498: E/AndroidRuntime(5600): java.lang.UnsatisfiedLinkError: 
allocate
08-26 00:06:10.498: E/AndroidRuntime(5600):     at 
com.googlecode.javacpp.Pointer.allocate(Native Method)
08-26 00:06:10.498: E/AndroidRuntime(5600):     at 
com.googlecode.javacpp.Pointer.<init>(Pointer.java:59)
08-26 00:06:10.498: E/AndroidRuntime(5600):     at 
com.googlecode.javacpp.BytePointer.<init>(BytePointer.java:45)

I only found 'javacv-macosx-x86_64.jar' in the target folder of javacv. How do
I compile it for Android?

Original comment by sh...@163.com on 25 Aug 2012 at 4:15

GoogleCodeExporter commented 9 years ago
Does ffmpeg-0.11.1-android-arm.zip already include the AAC and H.264 encoders?
I want to send a stream (CODEC_ID_FLV1/CODEC_ID_AAC).

Original comment by jorslb.i...@gmail.com on 25 Aug 2012 at 4:41

GoogleCodeExporter commented 9 years ago
@shoou For Android, please follow the instructions in the README.txt file.

@jorslb AAC, yes; H.264, no. You can find the configuration in the README.txt
file.

Original comment by samuel.a...@gmail.com on 28 Aug 2012 at 2:14

GoogleCodeExporter commented 9 years ago
Hey,

I'm missing com.googlecode.javacv.cpp.avformat.ByteIOContext somehow.

any ideas?

thanks, gilush

Original comment by Gilus...@gmail.com on 30 Aug 2012 at 8:32

GoogleCodeExporter commented 9 years ago
After reading all the threads I'm really curious to know: can I stream live
camera capture to any RTSP/RTMP URL?

Original comment by dpa...@gmail.com on 12 Sep 2012 at 12:57

GoogleCodeExporter commented 9 years ago
@Gilush14 Try again with a newer version of JavaCV.

@dpakrk If it works with FFmpeg, sure.

Please post your questions on the mailing list next time if possible, thank you.

Original comment by samuel.a...@gmail.com on 15 Sep 2012 at 4:53

GoogleCodeExporter commented 9 years ago
Hi,
can you provide me with the latest source code for streaming audio and video
to an RTMP server?

Thanks

Original comment by arun.go...@cyberlinks.in on 17 Sep 2012 at 1:35

GoogleCodeExporter commented 9 years ago
Hi,

can anybody tell me how to record video in FLV format with audio on Android?

Thanks.

Original comment by arun.go...@cyberlinks.in on 20 Sep 2012 at 5:40

GoogleCodeExporter commented 9 years ago
Hey,

I have compiled libx264 with ffmpeg7.0. Now I want to use this ffmpeg7.0 build
with OpenCV, so can someone please tell me more about how to compile OpenCV
against my ffmpeg7.0 to support libx264?

Thanks

Original comment by imrankha...@gmail.com on 31 Oct 2012 at 7:18

GoogleCodeExporter commented 9 years ago
I've just released JavaCV 0.3, and it fixes a lot of the related problems
people keep adding to this issue, so please try it out and let me know if
things are still not working right, thanks!

Original comment by samuel.a...@gmail.com on 5 Nov 2012 at 11:48

GoogleCodeExporter commented 9 years ago
Hi Samuel, nice job!
Is it possible to record (FFmpegFrameRecorder) raw elementary video streams 
(.h264, .mp4v) without using containers (.mp4, .3gp)?

Original comment by pariz...@gmail.com on 5 Nov 2012 at 10:02

GoogleCodeExporter commented 9 years ago
@parizene Should work, yes; try calling `FFmpegFrameRecorder.setFormat("m4v")`
or whatever format is supported by FFmpeg.

Original comment by samuel.a...@gmail.com on 6 Nov 2012 at 1:59

GoogleCodeExporter commented 9 years ago
Tried v0.3 and now my sound sounds really strange (chipmunk style :P). Any
idea why?
I've attached the .flv and my Android app.
Any help much appreciated.
Thanks

Original comment by florin.m...@gmail.com on 6 Nov 2012 at 3:01

Attachments:

GoogleCodeExporter commented 9 years ago
Try changing the sample rate.

Original comment by pariz...@gmail.com on 6 Nov 2012 at 4:34

GoogleCodeExporter commented 9 years ago
 private final static int sampleAudioBitRate = 44100; did the trick. Thanks

Original comment by florin.m...@gmail.com on 7 Nov 2012 at 10:21
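
Despite its name, the 44100 assigned to sampleAudioBitRate above is a sample
rate in Hz, not a bit rate; the "chipmunk" effect typically appears when audio
captured at one rate is written into the container with a different declared
rate. A pure-Java sketch of the arithmetic (the variable names and the 8000 Hz
capture rate are illustrative assumptions, not values from this thread):

```java
public class SampleRateSketch {
    public static void main(String[] args) {
        int capturedRate = 8000;   // rate the microphone was actually sampled at
        int declaredRate = 44100;  // rate written into the output file's header

        // If the container declares a higher rate than was captured, players
        // consume samples too fast: pitch and speed rise by this factor.
        double speedup = (double) declaredRate / capturedRate;
        System.out.println("speedup = " + speedup);

        // Raw PCM throughput at 44100 Hz, 16-bit (2-byte) mono samples:
        int bytesPerSecond = declaredRate * 2;
        System.out.println("bytesPerSecond = " + bytesPerSecond);
    }
}
```

Matching the declared rate to the capture rate (44100 Hz on both sides, as in
the comment above) removes the mismatch.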

GoogleCodeExporter commented 9 years ago
Yeah, it's working fine. Can you tell me how I should manage
sampleAudioBitRate for different devices?

Original comment by goyal.ar...@gmail.com on 8 Nov 2012 at 8:31

GoogleCodeExporter commented 9 years ago
With the current updated version of JavaCV, is it possible to convert the
Android camera's YUV to H.264 and then stream to an RTMP server?

Original comment by MartinYa...@gmail.com on 15 Nov 2012 at 9:53

GoogleCodeExporter commented 9 years ago
@MartinYanakiev Sure, do as per comment 31.

Original comment by samuel.a...@gmail.com on 15 Nov 2012 at 12:50

GoogleCodeExporter commented 9 years ago
In comment 31 I see this:

IplImage image = IplImage.create(width, height, IPL_DEPTH_8U, 2);
image.getByteBuffer().put(data);
theFFmpegFrameRecorder.record(image);

In this case, data[] is yuv, and my FFmpegFrameRecorder is initialized like 
this:

recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
recorder.setAudioBitrate(sampleAudioBitRate);
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
recorder.setFrameRate(frameRate);
recorder.setVideoBitrate(sampleVideoBitRate);
recorder.setPixelFormat(PIX_FMT_YUV420P);
recorder.setFormat("m4v");

The error that I receive with the updated JavaCV is that the video codec can't
be found.

Original comment by MartinYa...@gmail.com on 15 Nov 2012 at 1:45
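
Two things in the configuration above can trigger a "codec not found" error:
the "m4v" muxer is a raw MPEG-4 video elementary stream with no audio track,
so it cannot carry H.264 plus AAC together, and the android-arm FFmpeg build
discussed earlier in this thread was stated to lack the H.264 encoder
("AAC, yes, H264, no"). A rough illustrative sketch of that pairing logic in
plain Java (the tables below reflect only what this thread discusses, not the
full set of FFmpeg muxers and codecs, which depends on how FFmpeg was
configured):

```java
import java.util.*;

public class ContainerCodecSketch {
    // Illustrative pairings drawn from this thread, not an exhaustive list:
    // "m4v" is a raw MPEG-4 video elementary stream (no audio track), while
    // "flv" can carry FLV1 or H.264 video plus AAC or MP3 audio.
    static final Map<String, List<String>> VIDEO = new HashMap<>();
    static final Map<String, List<String>> AUDIO = new HashMap<>();
    static {
        VIDEO.put("m4v", Arrays.asList("mpeg4"));
        AUDIO.put("m4v", Collections.<String>emptyList());
        VIDEO.put("flv", Arrays.asList("flv1", "h264"));
        AUDIO.put("flv", Arrays.asList("aac", "mp3"));
    }

    // True if the container can hold the requested video codec, and the
    // requested audio codec too (null means "no audio stream wanted").
    static boolean supports(String format, String vcodec, String acodec) {
        return VIDEO.getOrDefault(format, Collections.<String>emptyList()).contains(vcodec)
            && (acodec == null
                || AUDIO.getOrDefault(format, Collections.<String>emptyList()).contains(acodec));
    }

    public static void main(String[] args) {
        System.out.println(supports("m4v", "h264", "aac")); // wrong muxer for this combo
        System.out.println(supports("flv", "flv1", "aac")); // the combo comment 31 targets
    }
}
```

Under these assumptions, switching the recorder to `setFormat("flv")` with
CODEC_ID_FLV1 video and CODEC_ID_AAC audio matches what the provided
android-arm FFmpeg build supports.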