DJI-Mobile-SDK-Tutorials / Android-VideoStreamDecodingSample

This sample project demonstrates how to use FFmpeg for video frame parsing and MediaCodec for hardware decoding on DJI products.

Why does the .so file only support Android 5.x? #1

Closed DennisGuo closed 6 years ago

DennisGuo commented 7 years ago

I can only run this example on Android 5.x or later, but there are lots of devices still running Android 4.x. How can I support those?

When loading the libraries on an Android 4.x device, the error below appears:

    AndroidRuntime: FATAL EXCEPTION: main
    java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "atof" referenced by "libffmpeg.so"...
        at java.lang.Runtime.loadLibrary(Runtime.java:365)
        at java.lang.System.loadLibrary(System.java:526)

tito91 commented 7 years ago

Same error on Samsung Galaxy Note 4 (Android 4.4.4).

oliverou commented 7 years ago

Hi, thanks for your feedback. We will investigate this issue over the next few days and get back to you once we make progress.

lillogoal commented 7 years ago

@oliverou Please check here: http://forum.dev.dji.com/forum.php?mod=viewthread&tid=33011&extra=page%3D1%26filter%3Dtypeid%26typeid%3D307 This is my issue.

oliverou commented 7 years ago

Hi @lillogoal, I think you have changed the package name of the NativeHelper file. This kind of log is quite common; please check the following page in the Google documentation: https://developer.android.com/training/articles/perf-jni.html


lillogoal commented 7 years ago

@oliverou Are you sure? I have copied the NativeHelper and DJIVideoStreamDecoder files:

NativeHelper:

    package com.wesii.wesiiplay.Media;

    /**

DJIVideoStreamDecoder:

    package com.wesii.wesiiplay.Media;

    import android.annotation.TargetApi;
    import android.content.Context;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.os.Build;
    import android.os.Handler;
    import android.os.HandlerThread;
    import android.os.Message;
    import android.util.Log;
    import android.view.Surface;

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.ByteBuffer;
    import java.util.LinkedList;
    import java.util.Queue;
    import java.util.concurrent.ArrayBlockingQueue;

    import dji.sdk.base.DJIBaseProduct;
    import dji.sdk.sdkmanager.DJISDKManager;
    import dji.common.product.Model;
    /**
     * This class is a helper class for hardware decoding. Please follow these steps to use it:
     *
     * 1. Initialize the instance and register it as a NativeDataListener to receive the frame data.
     *
     * 2. Send the raw data from the camera to ffmpeg for frame parsing.
     *
     * 3. Get the parsed frame data from ffmpeg's frame-parsing callback and cache it in the frameQueue.
     *
     * 4. Initialize the MediaCodec as a decoder, then check whether there is any I-frame in the queue. If not, get
     * the default I-frame from the SDK resources and insert it at the head of the frameQueue. Then dequeue frame data
     * from the frameQueue and feed it (as a byte buffer) into the MediaCodec.
     *
     * 5. Get the output byte buffer from the MediaCodec. If a surface (the video preview view) is configured in the
     * MediaCodec, the output byte buffer only needs to be released. If not, the output yuv data should be passed to
     * the external listener through the callback, and the buffer must be released as well.
     *
     * 6. Release the ffmpeg parser and the MediaCodec, and stop the decoding thread.
     */
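    // Overall usage sketch (a hedged outline; names follow this sample project):
    //
    //   DJIVideoStreamDecoder.getInstance().init(getApplicationContext(), surface); // or null for yuv output
    //   DJIVideoStreamDecoder.getInstance().parse(videoBuffer, size);               // per camera data callback
    //   DJIVideoStreamDecoder.getInstance().stop();                                 // on teardown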
    public class DJIVideoStreamDecoder implements NativeHelper.NativeDataListener  {
        private static final String TAG = DJIVideoStreamDecoder.class.getSimpleName();
        private static final int BUF_QUEUE_SIZE = 30;
        private static final int MSG_INIT_CODEC = 0;
        private static final int MSG_FRAME_QUEUE_IN = 1;
        private static final int MSG_DECODE_FRAME = 2;
        private static final int MSG_CHANGE_SURFACE = 3;
        private static final int CODEC_DEQUEUE_INPUT_QUEUE_RETRY = 20;
        public static final String VIDEO_ENCODING_FORMAT = "video/avc";

        private final boolean DEBUG = false;

        private static DJIVideoStreamDecoder instance;

        private Queue<DJIFrame> frameQueue;
        private HandlerThread dataHandlerThread;
        private Handler dataHandler;
        private HandlerThread callbackHandlerThread;
        private Handler callbackHandler;
        private Context context;
        private MediaCodec codec;
        private Surface surface;

        public int frameIndex = -1;
        private long currentTime;
        public int width;
        public int height;
        private boolean hasIFrameInQueue = false;
        private boolean hasIFrameInCodec;
        private ByteBuffer[] inputBuffers;
        private ByteBuffer[] outputBuffers;
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        LinkedList<Long> bufferChangedQueue=new LinkedList<Long>();

        private long createTime;

        public interface IYuvDataListener {
            /**
             * Callback method for processing the yuv frame data from the hardware decoder.
             * @param yuvFrame The decoded yuv frame data.
             * @param width Width of the frame in pixels.
             * @param height Height of the frame in pixels.
             */
            void onYuvDataReceived(byte[] yuvFrame, int width, int height);
        }

        /**
         * Set the yuv frame data receiving callback. The callback method will be invoked when the decoder
         * outputs yuv frame data. Note that the hardware decoder will not output any yuv data if a surface
         * is configured, which means that if you want the yuv frames, you should pass a null surface when
         * calling MediaCodec's configure method.
         * @param yuvDataListener The listener that receives the yuv frames.
         */
        public void setYuvDataListener(IYuvDataListener yuvDataListener) {
            this.yuvDataListener = yuvDataListener;
        }
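        // Usage sketch (hedged; "myYuvListener" is a hypothetical listener instance):
        //   DJIVideoStreamDecoder.getInstance().init(context, null); // a null surface selects the yuv path
        //   DJIVideoStreamDecoder.getInstance().setYuvDataListener(myYuvListener);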

        private IYuvDataListener yuvDataListener;

        /**
         * A data structure for containing the frames.
         */
        private static class DJIFrame {
            public byte[] videoBuffer;
            public int size;
            public long pts;
            public long incomingTimeMs;
            public long fedIntoCodecTime;
            public long codecOutputTime;
            public boolean isKeyFrame;
            public int frameNum;
            public long frameIndex;
            public int width;
            public int height;

            public DJIFrame(byte[] videoBuffer, int size, long pts, long incomingTimeMs, boolean isKeyFrame,
                            int frameNum, long frameIndex, int width, int height){
                this.videoBuffer=videoBuffer;
                this.size=size;
                this.pts =pts;
                this.incomingTimeMs=incomingTimeMs;
                this.isKeyFrame=isKeyFrame;
                this.frameNum=frameNum;
                this.frameIndex=frameIndex;
                this.width=width;
                this.height=height;
            }

            public long getQueueDelay()
            {
                return fedIntoCodecTime-incomingTimeMs;
            }

            public long getDecodingDelay()
            {
                return codecOutputTime-fedIntoCodecTime;
            }

            public long getTotalDelay()
            {
                // Total delay spans from frame arrival to codec output.
                return codecOutputTime-incomingTimeMs;
            }
        }

        private void logd(String tag, String log) {
            if (!DEBUG) {
                return;
            }
            Log.d(tag, log);
        }
        private void loge(String tag, String log) {
            if (!DEBUG) {
                return;
            }
            Log.e(tag, log);
        }

        private void logd(String log) {
            logd(TAG, log);
        }
        private void loge(String log) {
            loge(TAG, log);
        }

        private DJIVideoStreamDecoder() {
            createTime = System.currentTimeMillis();
            frameQueue = new ArrayBlockingQueue<DJIFrame>(BUF_QUEUE_SIZE);
            startDataHandler();
            callbackHandlerThread = new HandlerThread("callback handler");
            callbackHandlerThread.start();
            callbackHandler = new Handler(callbackHandlerThread.getLooper());
        }

        public static DJIVideoStreamDecoder getInstance() {
            if (instance == null) {
                instance = new DJIVideoStreamDecoder();
            }
            return instance;
        }

        /**
         * Initialize the decoder
         * @param context The application context
         * @param surface The displaying surface for the video stream. Note that the hardware decoder will not
         * output any yuv data if a surface is configured, which means that if you want the yuv frames, you
         * should pass a null surface here.
         */
        public void init(Context context, Surface surface) {
            this.context = context;
            this.surface = surface;
            NativeHelper.getInstance().init();

            NativeHelper.getInstance().setDataListener(this);
            if (dataHandler != null) {
                dataHandler.sendEmptyMessage(MSG_INIT_CODEC);
            }
        }

        /**
         * Parse the raw data from the camera into frames.
         * @param buf Raw data from the camera.
         * @param size Data length.
         */
        public void parse(byte[] buf, int size) {
            logd( "parse data size: " + size);
            NativeHelper.getInstance().parse(buf, size);
        }
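        // Usage sketch (an assumption: in DJI Android SDK 3.x the camera's video data callback
        // looks roughly like this; the callback names come from that SDK and should be verified):
        //   camera.setDJICameraReceivedVideoDataCallback(new DJICamera.CameraReceivedVideoDataCallback() {
        //       @Override
        //       public void onResult(byte[] videoBuffer, int size) {
        //           DJIVideoStreamDecoder.getInstance().parse(videoBuffer, size);
        //       }
        //   });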

        /**
         * Get the resource ID of the IDR frame.
         * @param pModel Product model of the connected DJI product.
         * @param width Width of the current video stream.
         * @return Resource ID of the IDR frame.
         */
        public int getIframeRawId(Model pModel, int width) {
            int iframeId = dji.midware.R.raw.iframe_1280x720_ins;

            switch(pModel) {
                case Phantom_3_Advanced:
                case Phantom_3_Standard:
                    if (width==960) {
                        //for photo mode, 960x720, GDR
                        iframeId = dji.midware.R.raw.iframe_960x720_3s;
                    } else {
                        //for record mode, 1280x720, GDR
                        iframeId = dji.midware.R.raw.iframe_1280x720_3s;
                    }
                    break;

                case Phantom_3_4K:
                    switch(width) {
                        case 640:
                            //for P3-4K with resolution 640*480
                            iframeId = dji.midware.R.raw.iframe_640x480;
                            break;
                        case 848:
                            //for P3-4K with resolution 848*480
                            iframeId = dji.midware.R.raw.iframe_848x480;
                            break;
                        default:
                            iframeId = dji.midware.R.raw.iframe_1280x720_3s;
                            break;
                    }
                    break;

                case Osmo_Pro:
                case Osmo:
                    iframeId = -1;
                    break;

                case Phantom_4:
                    iframeId = dji.midware.R.raw.iframe_1280x720_p4;
                    break;

            default: //for P3P, Inspire1, etc.
                    iframeId = dji.midware.R.raw.iframe_1280x720_ins;
                    break;
            }

            return iframeId;
        }

        /** Get default black IDR frame.
         * @param width Width of current video stream.
         * @return IDR frame data
         * @throws IOException
         */
        private byte[] getDefaultKeyFrame(int width) throws IOException {
            DJIBaseProduct product = DJISDKManager.getInstance().getDJIProduct();
            if (product == null || product.getModel() == null) {
                return null;
            }
            int iframeId=getIframeRawId(product.getModel(), width);
            if (iframeId >= 0){

                InputStream inputStream = context.getResources().openRawResource(iframeId);
                int length = inputStream.available();
                logd("iframeId length=" + length);
                byte[] buffer = new byte[length];
                int bytesRead = inputStream.read(buffer); // may read fewer bytes than requested
                if (bytesRead != length) {
                    loge("getDefaultKeyFrame: expected " + length + " bytes, read " + bytesRead);
                }
                inputStream.close();

                return buffer;
            }
            return null;
        }

        /**
         * Initialize the hardware decoder.
         */
        private void initCodec() {
            if (width == 0 || height == 0) {
                return;
            }
            if (codec != null) {
                releaseCodec();
            }
            loge("initVideoDecoder video width = " + width + "  height = " + height);
            // create the media format
            MediaFormat format = MediaFormat.createVideoFormat(VIDEO_ENCODING_FORMAT, width, height);
            if (surface == null) {
                logd("initVideoDecoder: yuv output");
                // The surface is null, which means that the yuv data is needed, so the color format should
                // be set to YUV420.
                format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            } else {
                logd("initVideoDecoder: display");
                // The surface is set, so the color format should be set to format surface.
                format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            }
            try {
                // Create the codec instance.
                codec = MediaCodec.createDecoderByType(VIDEO_ENCODING_FORMAT);
                logd( "initVideoDecoder create");
                // Configure the codec. Note that the hardware decoder will not output any yuv data if
                // a surface is configured, which means that if you want the yuv frames, you should
                // pass a null surface when calling MediaCodec's configure method.
                // createDecoderByType throws on failure rather than returning null, so no null check is needed.
                codec.configure(format, surface, null, 0);
                logd( "initVideoDecoder configure");
                // Start the codec
                codec.start();
                logd( "initVideoDecoder start");
                // Get the input and output buffers of hardware decoder
                inputBuffers = codec.getInputBuffers();
                outputBuffers = codec.getOutputBuffers();
                logd( "initVideoDecoder get buffers");

            } catch (Exception e) {
                loge("init codec failed: " + e);
                e.printStackTrace();
            }
        }

        private void startDataHandler() {
            if (dataHandlerThread != null && dataHandlerThread.isAlive()) {
                return;
            }
            dataHandlerThread = new HandlerThread("frame data handler thread");
            dataHandlerThread.start();
            dataHandler = new Handler(dataHandlerThread.getLooper()) {
                @Override
                public void handleMessage(Message msg) {
                    switch (msg.what) {
                        case MSG_INIT_CODEC:
                            try {
                                initCodec();
                            } catch (Exception e) {
                                loge("init codec error: " + e.getMessage());
                                e.printStackTrace();
                            }

                            removeCallbacksAndMessages(null);
                            sendEmptyMessageDelayed(MSG_DECODE_FRAME, 1);
                            break;
                        case MSG_FRAME_QUEUE_IN:
                            try {
                                onFrameQueueIn(msg);
                            } catch (Exception e) {
                                loge("queue in frame error: " + e);
                                e.printStackTrace();
                            }

                            if (!hasMessages(MSG_DECODE_FRAME)) {
                                sendEmptyMessage(MSG_DECODE_FRAME);
                            }
                            break;
                        case MSG_DECODE_FRAME:
                            try {
                                decodeFrame();
                            } catch (Exception e) {
                                loge("handle frame error: " + e);
                                e.printStackTrace();
                                initCodec();
                            }finally {
                                if (frameQueue.size() > 0) {
                                    sendEmptyMessage(MSG_DECODE_FRAME);
                                }
                            }
                            break;
                        case MSG_CHANGE_SURFACE:

                            break;
                        default:
                            break;
                    }
                }
            };
            dataHandler.sendEmptyMessage(MSG_DECODE_FRAME);
        }

        /**
         * Stop the data processing thread
         */
        private void stopDataHandler() {
            if (dataHandlerThread == null || !dataHandlerThread.isAlive()) {
                return;
            }
            if (dataHandler != null) {
                dataHandler.removeCallbacksAndMessages(null);
            }
            if (Build.VERSION.SDK_INT >= 18) {
                dataHandlerThread.quitSafely();
            } else {
                dataHandlerThread.quit();
            }

            try {
                dataHandlerThread.join(3000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }

            releaseCodec();
            dataHandler = null;
        }

        /**
         * Change the displaying surface of the decoder. Note that the hardware decoder will not output
         * any yuv data if a surface is configured, which means that if you want the yuv frames, you
         * should pass a null surface here.
         * @param surface The new output surface, or null for yuv callback output.
         */
        public void changeSurface(Surface surface) {
            this.surface = surface;
            initCodec();
        }

        /**
         * Release and close the codec.
         */
        private void releaseCodec() {
            if (frameQueue!=null){
                frameQueue.clear();
                hasIFrameInQueue = false;
                hasIFrameInCodec = false;
            }
            if (codec != null) {
                try {
                    codec.flush();
                } catch (Exception e) {
                    loge("flush codec error: " + e.getMessage());
                    // Do not null out the codec here; release() below still needs to run.
                }

                try {
                    codec.release();
                } catch (Exception e) {
                    loge("close codec error: " + e.getMessage());
                } finally {
                    codec = null;
                }
            }
        }

        /**
         * Queue in the frame.
         * @param msg
         */
        private void onFrameQueueIn(Message msg) {
            DJIFrame inputFrame = (DJIFrame)msg.obj;
            if (inputFrame == null) {
                return;
            }
            if (!hasIFrameInQueue) { // check the I frame flag
                if (inputFrame.frameNum !=1 && !inputFrame.isKeyFrame) {
                    loge("the timing for setting iframe has not yet come.");
                    return;
                }
                byte[] defaultKeyFrame = null;
                try {
                    defaultKeyFrame = getDefaultKeyFrame(inputFrame.width); // Get I frame data
                } catch (IOException e) {
                    loge("get default key frame error: " + e.getMessage());
                }
                if (defaultKeyFrame != null) {
                    DJIFrame iFrame = new DJIFrame(
                            defaultKeyFrame,
                            defaultKeyFrame.length,
                            inputFrame.pts,
                            System.currentTimeMillis(),
                            inputFrame.isKeyFrame,
                            0,
                            inputFrame.frameIndex - 1,
                            inputFrame.width,
                            inputFrame.height
                    );
                    frameQueue.clear();
                    frameQueue.offer(iFrame); // Queue in the I frame.
                    logd("add iframe success!!!!");
                    hasIFrameInQueue = true;
                } else if (inputFrame.isKeyFrame) {
                    logd("onFrameQueueIn no need add i frame!!!!");
                    hasIFrameInQueue = true;
                } else {
                    loge("input key frame failed");
                }
            }
            if (inputFrame.width != 0 && inputFrame.height != 0 &&
                    (inputFrame.width != this.width ||
                    inputFrame.height != this.height)) {
                this.width = inputFrame.width;
                this.height = inputFrame.height;
                /*
                 * On some devices the codec supports changing the resolution on the fly,
                 * but on others it does not, so reset the codec whenever the resolution
                 * changes (and for the first frame).
                 */
                loge("init decoder for the 1st time or when resolution changes");
                initCodec();
            }
            // Queue in the input frame.
            if (this.frameQueue.offer(inputFrame)){
                logd("put a frame into the Extended-Queue with index=" + inputFrame.frameIndex);
            } else {
                // If the queue is full, drop a frame.
                DJIFrame dropFrame = frameQueue.poll();
                this.frameQueue.offer(inputFrame);
                loge("Drop a frame with index=" + dropFrame.frameIndex+" and append a frame with index=" + inputFrame.frameIndex);
            }
        }

        /**
         * Dequeue the frames from the queue and decode them using the hardware decoder.
         * @throws Exception
         */
        @TargetApi(Build.VERSION_CODES.LOLLIPOP)
        private void decodeFrame() throws Exception {
            DJIFrame inputFrame = frameQueue.poll();
            if (inputFrame == null) {
                return;
            }
            if (codec == null) {
                initCodec();
            }
            int inIndex = -1;

            // Get input buffer index of the MediaCodec.
            for (int i = 0; i < CODEC_DEQUEUE_INPUT_QUEUE_RETRY && inIndex < 0; i ++) {
                try {
                    inIndex = codec.dequeueInputBuffer(0);
                } catch (IllegalStateException e) {
                    logd(TAG, "decodeFrame: dequeue input: " + e);
                    codec.stop();
                    codec.reset();
                    initCodec();
                    e.printStackTrace();
                }
            }
            logd(TAG, "decodeFrame: index=" + inIndex);

            // Decode the frame using MediaCodec
            if (inIndex >= 0) {
                ByteBuffer buffer = inputBuffers[inIndex];
                buffer.clear();
                buffer.rewind();
                buffer.put(inputFrame.videoBuffer);

                inputFrame.fedIntoCodecTime = System.currentTimeMillis();
                long queueingDelay = inputFrame.getQueueDelay();
                logd("input frame delay: " + queueingDelay);
                // Feed the frame data to the decoder.
                codec.queueInputBuffer(inIndex, 0, inputFrame.size, inputFrame.pts, 0);
                hasIFrameInCodec = true;

                // Get the output data from the decoder.
                int outIndex = codec.dequeueOutputBuffer(bufferInfo, 0);
                logd(TAG, "decodeFrame: outIndex: " + outIndex);
                if (outIndex >= 0) {
                    if (surface == null && yuvDataListener != null) {
                        // If the surface is null, read the yuv data from the output buffer and pass it to the callback.
                        logd("decodeFrame: need callback");
                        ByteBuffer yuvDataBuf = outputBuffers[outIndex];
                        yuvDataBuf.position(bufferInfo.offset);
                        // The valid region of the buffer is [offset, offset + size).
                        yuvDataBuf.limit(bufferInfo.offset + bufferInfo.size);
                        final byte[] bytes = new byte[bufferInfo.size];
                        yuvDataBuf.get(bytes);
                        callbackHandler.post(new Runnable() {
                            @Override
                            public void run() {
                                yuvDataListener.onYuvDataReceived(bytes, width, height);
                            }
                        });
                    }
                    // All output buffers must be released, whether or not the yuv data was used,
                    // so that the codec can reuse them.
                    codec.releaseOutputBuffer(outIndex, true);
                } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                    // The output buffer set has changed; retrieve the new buffers, and reset the
                    // decoder if this happens too frequently.
                    long curTime = System.currentTimeMillis();
                    bufferChangedQueue.addLast(curTime);
                    if (bufferChangedQueue.size() >= 10) {
                        long headTime = bufferChangedQueue.pollFirst();
                        if (curTime - headTime < 1000) {
                            // reset decoder
                            loge("Reset decoder. Get INFO_OUTPUT_BUFFERS_CHANGED more than 10 times within a second.");
                            bufferChangedQueue.clear();
                            dataHandler.removeCallbacksAndMessages(null);
                            dataHandler.sendEmptyMessage(MSG_INIT_CODEC);
                            return;
                        }
                    }
                    if (outputBuffers == null) {
                        return;
                    }
                    outputBuffers = codec.getOutputBuffers();
                } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    loge("format changed, color: " + codec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT));
                }
            }
        }

        /**
         * Stop the decoding process.
         */
        public void stop() {
            if (dataHandler != null) {
                dataHandler.removeCallbacksAndMessages(null);
            }
            frameQueue.clear();
            hasIFrameInQueue = false;
            hasIFrameInCodec = false;
            if (codec != null) {
                try {
                    codec.flush();
                } catch (IllegalStateException e) {
                    // The codec may already be stopped or released; ignore.
                }
            }
            stopDataHandler();
        }

        public void resume() {
            startDataHandler();
        }

        public void destroy() {
            // Stop and release the codec before releasing the native parser.
            if (codec != null) {
                codec.stop();
                codec.release();
                codec = null;
            }
            NativeHelper.getInstance().release();
        }

        @Override
        public void onDataRecv(byte[] data, int size, int frameNum, boolean isKeyFrame, int width, int height) {
            if (dataHandler == null || dataHandlerThread == null || !dataHandlerThread.isAlive()) {
                return;
            }
            if (data.length != size) {
                loge( "recv data size: " + size + ", data lenght: " + data.length);
            } else {
                logd( "recv data size: " + size + ", frameNum: "+frameNum+", isKeyframe: "+isKeyFrame+"," +
                        " width: "+width+", height: " + height);
                currentTime = System.currentTimeMillis();
                frameIndex ++;
                DJIFrame newFrame = new DJIFrame(data, size, currentTime, currentTime, isKeyFrame,
                        frameNum, frameIndex, width, height);
                dataHandler.obtainMessage(MSG_FRAME_QUEUE_IN, newFrame).sendToTarget();

            }
        }
    }


oliverou commented 7 years ago

Hi @lillogoal, you have changed the package names of the above two files.

You should change the corresponding method names in the jni/dji_video_jni.c file and rebuild the .so libraries.
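For reference, JNI binds each native method to a C symbol derived from the fully qualified Java name, with dots replaced by underscores, which is why renaming the package breaks the lookup at load time. A minimal sketch, assuming the package shown above and a hypothetical parse method signature:

    // dji_video_jni.c (sketch): after moving NativeHelper to com.wesii.wesiiplay.Media,
    // the exported symbol must encode that package:
    #include <jni.h>

    JNIEXPORT jboolean JNICALL
    Java_com_wesii_wesiiplay_Media_NativeHelper_parse(JNIEnv *env, jobject obj,
                                                      jbyteArray buf, jint size)
    {
        /* ... hand the bytes to the ffmpeg parser ... */
        return JNI_TRUE;
    }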

Furthermore, you can check this Stack Overflow question: http://stackoverflow.com/questions/16518490/java-lang-unsatisfiedlinkerror-jni

lillogoal commented 7 years ago

So should I just implement your whole sample project as-is?

lillogoal commented 7 years ago

@oliverou I have this problem when recompiling the C files.

NDK build output:

    ndk-build
    [armeabi-v7a] Compile thumb : djivideojni <= dji_video_jni.c
    jni/dji_video_jni.c:65:51: warning: passing 'uint8_t *' (aka 'unsigned char *') to parameter of type 'const jbyte *' (aka 'const signed char *') converts between pointers to integer types with different sign [-Wpointer-sign]
        (*env)->SetByteArrayRegion(env, jarray, 0, size, buf);
                                                          ^~~
    1 warning generated.
    [armeabi-v7a] SharedLibrary : libdjivideojni.so
    D:/Users/Lorenzo/AppData/Local/Android/sdk/ndk-bundle/build//../toolchains/arm-linux-androideabi-4.9/prebuilt/windows-x86_64/lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin\ld: error: ./obj/local/armeabi-v7a/libffmpeg.so:1:9: syntax error, unexpected STRING
    D:/Users/Lorenzo/AppData/Local/Android/sdk/ndk-bundle/build//../toolchains/arm-linux-androideabi-4.9/prebuilt/windows-x86_64/lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin\ld: error: ./obj/local/armeabi-v7a/libffmpeg.so: not an object or archive
    jni/dji_video_jni.c:43: error: undefined reference to 'av_register_all'
    jni/dji_video_jni.c:45: error: undefined reference to 'av_codec_next'
    jni/dji_video_jni.c:78: error: undefined reference to 'avcodec_register_all'
    jni/dji_video_jni.c:79: error: undefined reference to 'av_register_all'
    jni/dji_video_jni.c:82: error: undefined reference to 'avcodec_find_decoder'
    jni/dji_video_jni.c:83: error: undefined reference to 'avcodec_alloc_context3'
    jni/dji_video_jni.c:84: error: undefined reference to 'av_parser_init'
    jni/dji_video_jni.c:97: error: undefined reference to 'avcodec_open2'
    jni/dji_video_jni.c:103: error: undefined reference to 'av_frame_alloc'
    jni/dji_video_jni.c:127: error: undefined reference to 'av_init_packet'
    jni/dji_video_jni.c:132: error: undefined reference to 'av_parser_parse2'
    jni/dji_video_jni.c:165: error: undefined reference to 'av_free_packet'
    jni/dji_video_jni.c:223: error: undefined reference to 'avcodec_close'
    jni/dji_video_jni.c:227: error: undefined reference to 'av_free'
    jni/dji_video_jni.c:228: error: undefined reference to 'av_free'
    jni/dji_video_jni.c:229: error: undefined reference to 'av_parser_close'
    clang++.exe: error: linker command failed with exit code 1 (use -v to see invocation)
    make: *** [obj/local/armeabi-v7a/libdjivideojni.so] Error 1

pisarik commented 7 years ago

Hello!

I successfully built this sample, and it works fine on API >= 21. Then I embedded this solution in my project: I renamed all the necessary methods in dji_video_jni.c to match my packages, and my project also works fine on API >= 21. But when I try to run it on my phone with Android API level 19, I get the following error:

    FATAL EXCEPTION: main
    Process: uiip.dji.pcapi.com, PID: 8922
    java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "atof" referenced by "libffmpeg.so"...
        at java.lang.Runtime.loadLibrary(Runtime.java:364)
        at java.lang.System.loadLibrary(System.java:526)
        at uiip.dji.pcapi.com.media.NativeHelper.<clinit>(NativeHelper.java:65)
        at uiip.dji.pcapi.com.MainActivity.onCreate(MainActivity.java:21)
        at android.app.Activity.performCreate(Activity.java:5275)
        at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2166)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2252)
        at android.app.ActivityThread.access$800(ActivityThread.java:139)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1200)
        at android.os.Handler.dispatchMessage(Handler.java:102)
        at android.os.Looper.loop(Looper.java:136)
        at android.app.ActivityThread.main(ActivityThread.java:5103)
        at java.lang.reflect.Method.invokeNative(Native Method)

First, I tried setting targetSdkVersion to 19 in build.gradle (Module: app), adding the line APP_PLATFORM := android-19 to libs/Application.mk, and rebuilding the jni libs. But this didn't help.
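For reference, the Application.mk change described above looks roughly like this (a sketch; the APP_ABI value is an assumption based on the armeabi-v7a build output earlier in this thread):

    # libs/Application.mk
    APP_ABI := armeabi-v7a
    APP_PLATFORM := android-19   # oldest API level the app must run on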

Then I found this post and decided to try changing compileSdkVersion to 19. I deleted all the API-specific pieces such as the appCompat styles, and I also made the decodeFrame() method in DJIVideoStreamDecoder empty (just as a mock), because it calls MediaCodec.reset(), which is API level 21. That didn't help either.

Then I tried compiling with different older NDK versions (9b, 10e, 11c) and a different Android plugin version (2.1.3). It didn't help.

Is the problem in libffmpeg.so? How should I deal with it?

dji-dev commented 6 years ago

Will go ahead and close this ticket, as the current version should already have a fix for this. Please feel free to reopen if the problem still exists. Thanks!