DJI-Mobile-SDK-Tutorials / Android-VideoStreamDecodingSample

This sample project demonstrates how to use FFmpeg for video frame parsing and MediaCodec for hardware decoding on DJI products.

Video Stream Decoding: how to retrieve NV21 data from MediaCodec output #8

Status: Open · giannisergente opened this issue 7 years ago

giannisergente commented 7 years ago

I'm starting from the "Video Stream Decoding Sample" demo in order to obtain raw video data from a DJI Phantom 3 Professional drone and pass it to my augmented reality framework (Wikitude SDK). In particular, I need to pass YUV 4:2:0 data laid out according to the NV21 standard to my framework, so I'm trying to retrieve this data from the MediaCodec output.

On this point, I tried to retrieve ByteBuffers from the MediaCodec output (this is possible by passing a null Surface to the configure() method, which causes the decoded frames to be delivered through a callback to an external listener), but the colours in the result are wrong: blue and red appear to be swapped, and there is a lot of noise when the camera moves. (Please note that when I pass a non-null Surface, MediaCodec renders the frames onto it correctly after the instruction codec.releaseOutputBuffer(outIndex, true), but I need to pass the video stream to the Wikitude plugin, so I must set the Surface to null.)

I tried setting different MediaFormat.KEY_COLOR_FORMAT values, but none of them works properly. How can I retrieve NV21 data from the MediaCodec output?
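For reference, the ByteBuffer path I am describing looks roughly like this (a sketch; onYuvFrame() is a placeholder for our listener, and the layout of the copied bytes is exactly what is in question here):

// Sketch of the Surface == null path (requires android.media.MediaCodec,
// android.media.MediaFormat, java.nio.ByteBuffer). onYuvFrame() is a
// placeholder; the byte layout depends on the decoder's color format.
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
MediaCodec codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
codec.configure(format, null /* no Surface: output stays in ByteBuffers */, null, 0);
codec.start();

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = codec.dequeueOutputBuffer(info, 10000);
if (outIndex >= 0) {
    ByteBuffer outBuffer = codec.getOutputBuffer(outIndex);
    byte[] yuv = new byte[info.size];
    outBuffer.get(yuv);                           // copy the decoded frame out
    onYuvFrame(yuv);                              // placeholder listener
    codec.releaseOutputBuffer(outIndex, false);   // false: nothing to render
}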

dji-dev commented 7 years ago

Hi @giannisergente, could you provide the steps to reproduce the issue (too much noise when the camera moves)? There is a screenshot feature in the demo; please save screenshots (press the Screen Shot button) while moving the camera and check whether the screenshots show the noise. Thanks!

giannisergente commented 7 years ago

Hi @dji-dev. We tried the screenshot() method (with the Surface parameter set to null) and observed that the colours are correct in the saved JPG files, but the noise persists when the camera moves. Could this issue be related to our Android version (5.0.2, API 21)? We observe less noise on other smartphones (we are working on a Samsung Galaxy Grand Prime)...

https://ibb.co/fKo0dv https://ibb.co/jeej5a https://ibb.co/mrNhrF

Turning back to our main issue: if we pass the result of the NV21 conversion below directly to our AR plugin (without any JPEG conversion), the colour mismatch disappears and the rendering is correct (although the noise is still present).


//nv21test
// NV21 layout: full Y plane first, then interleaved chroma (V byte, then U byte)
byte[] bytes = new byte[yuvFrame.length];
System.arraycopy(y, 0, bytes, 0, y.length);    // copy the Y plane
for (int i = 0; i < u.length; i++) {
    bytes[y.length + (i * 2)] = nv[i];         // V sample
    bytes[y.length + (i * 2) + 1] = nu[i];     // U sample
}

//callback to our AR rendering routine accepting NV21 data frames
notifyNewCameraFrameN21(bytes);

Now the problem is that doing this for every YUV frame coming from MediaCodec introduces a significant delay, and the app crashes after a few seconds. More precisely, this is the error:

02-11 12:08:41.336 E/AndroidRuntime( 2631): java.lang.OutOfMemoryError: Failed to allocate a 1382412 byte allocation with 298062 free bytes and 291KB until OOM
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at com.dji.videostreamdecodingsample.MainActivity.onYuvDataReceived(MainActivity.java:686)
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at com.dji.videostreamdecodingsample.media.DJIVideoStreamDecoder$3.run(DJIVideoStreamDecoder.java:1305)
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at android.os.Handler.handleCallback(Handler.java:739)
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at android.os.Handler.dispatchMessage(Handler.java:95)
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at android.os.Looper.loop(Looper.java:135)
02-11 12:08:41.336 E/AndroidRuntime( 2631):     at android.os.HandlerThread.run(HandlerThread.java:61)

related to this line of code:

//nv21test 
byte[] bytes = new byte[yuvFrame.length];          <--

We also tried adding android:largeHeap="true" to the Manifest, but without much improvement.
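One thing that might avoid the per-frame allocation is hoisting the buffer out of the callback and reusing it (a rough sketch, assuming the frame size is constant and that notifyNewCameraFrameN21() consumes or copies the buffer before the next frame overwrites it):

// Rough sketch: allocate the NV21 buffer once and reuse it across frames.
private byte[] nv21Buffer;                            // field, reused across frames

// inside the YUV callback, instead of 'new byte[yuvFrame.length]':
if (nv21Buffer == null || nv21Buffer.length != yuvFrame.length) {
    nv21Buffer = new byte[yuvFrame.length];           // allocate only once
}
System.arraycopy(y, 0, nv21Buffer, 0, y.length);      // Y plane
for (int i = 0; i < u.length; i++) {                  // interleaved V, U pairs
    nv21Buffer[y.length + (i * 2)] = nv[i];
    nv21Buffer[y.length + (i * 2) + 1] = nu[i];
}
notifyNewCameraFrameN21(nv21Buffer);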

Any suggestions?

niflying commented 7 years ago

@giannisergente Great, we have the same issue.
https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/issues/11

Did you work it out?

giannisergente commented 7 years ago

Hi @niflying, regarding the first issue (retrieving NV21 data from the MediaCodec output): we bypassed the problem by passing a non-null Surface and letting MediaCodec render onto it directly (this is sufficient for our purposes, using the SurfaceView layer). When MediaCodec renders directly onto the Surface, there is no colour mismatch. For the Surface == null case, unfortunately we do not yet have a solution for converting the frames correctly.
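For completeness, the direct-rendering configuration is just (a sketch; surfaceView is the SurfaceView from our layout):

// Sketch: pass a real Surface so MediaCodec renders the frames itself.
Surface surface = surfaceView.getHolder().getSurface();
codec.configure(format, surface, null, 0);    // non-null Surface
codec.start();
// ... later, after dequeueOutputBuffer() returns a valid index:
codec.releaseOutputBuffer(outIndex, true);    // 'true' = render to the Surface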

Regarding the noise on the video stream: it persists in our tests and may depend on the ISO and shutter speed parameters. We observe different behaviour depending on the device used. We would like to try a tablet, but there is currently a library problem (the ffmpeg library is not compiled for the x86 architecture).

niflying commented 7 years ago

Oh, that's really sad. I need bitmap data for every frame so I can send it to Unity3D, and I didn't know how to get it until I found this sample. I thought I could get the bitmap as a YUV image from this sample, but I only get broken images. It has taken me almost a week. Does anyone know how to get the camera preview's bitmap from the DJI SDK? @dji-dev

niflying commented 7 years ago

Hi @giannisergente, I think we went down the same path. I use a SurfaceTexture and then render it, but it looks like I only get the light level from the camera: a flickering red screen. If I cover the camera, it turns dark.
img_3809

I tried the same code to show the phone's own camera, and it works fine.

I think the problem may be in the draw part.

Here is my draw code; please have a look.

public class DirectDrawer {

   private final String vertexShaderCode =
        "attribute vec4 vPosition;" +
                "attribute vec2 inputTextureCoordinate;" +
                "varying vec2 textureCoordinate;" +
                "void main()" +
                "{" +
                "gl_Position = vPosition;" +
                "textureCoordinate = inputTextureCoordinate;" +
                "}";

private final String fragmentShaderCode =
        "#extension GL_OES_EGL_image_external : require\n" +
                "precision mediump float;" +
                "varying vec2 textureCoordinate;\n" +
                "uniform samplerExternalOES s_texture;\n" +
                "void main() {" +
                "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
                "}";

private FloatBuffer vertexBuffer, textureVerticesBuffer;
private ShortBuffer drawListBuffer;
private final int mProgram;
private int mPositionHandle;
private int mTextureCoordHandle;

private short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices

// number of coordinates per vertex in this array
private static final int COORDS_PER_VERTEX = 2;

private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

static float squareCoords[] = {
        -1.0f, 1.0f,
        -1.0f, -1.0f,
        1.0f, -1.0f,
        1.0f, 1.0f,
};

static float textureVertices[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,
};

private int texture;

public DirectDrawer(int texture) {
    this.texture = texture;
    // initialize vertex byte buffer for shape coordinates
    ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
    bb.order(ByteOrder.nativeOrder());
    vertexBuffer = bb.asFloatBuffer();
    vertexBuffer.put(squareCoords);
    vertexBuffer.position(0);

    // initialize byte buffer for the draw list
    ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
    dlb.order(ByteOrder.nativeOrder());
    drawListBuffer = dlb.asShortBuffer();
    drawListBuffer.put(drawOrder);
    drawListBuffer.position(0);

    ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
    bb2.order(ByteOrder.nativeOrder());
    textureVerticesBuffer = bb2.asFloatBuffer();
    textureVerticesBuffer.put(textureVertices);
    textureVerticesBuffer.position(0);

    int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
    int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

    mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
    GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
    GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
    GLES20.glLinkProgram(mProgram);                  // creates OpenGL ES program executables
}

public void draw(float[] mtx) {
    GLES20.glUseProgram(mProgram);

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

    // get handle to vertex shader's vPosition member
    mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");

    // Enable a handle to the triangle vertices
    GLES20.glEnableVertexAttribArray(mPositionHandle);

    // Prepare the <insert shape here> coordinate data
    GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

    mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
    GLES20.glEnableVertexAttribArray(mTextureCoordHandle);

    textureVerticesBuffer.clear();
    textureVerticesBuffer.put(transformTextureCoordinates(textureVertices, mtx));
    textureVerticesBuffer.position(0);

    GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    // Disable vertex array
    GLES20.glDisableVertexAttribArray(mPositionHandle);
    GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
}

private int loadShader(int type, String shaderCode) {

    // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
    // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
    int shader = GLES20.glCreateShader(type);

    // add the source code to the shader and compile it
    GLES20.glShaderSource(shader, shaderCode);
    GLES20.glCompileShader(shader);

    return shader;
}

private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
    float[] result = new float[coords.length];
    float[] vt = new float[4];

    for (int i = 0; i < coords.length; i += 2) {
        float[] v = {coords[i], coords[i + 1], 0, 1};
        Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
        result[i] = vt[0];
        result[i + 1] = vt[1];
    }
    return result;
}

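For anyone comparing against this code: the drawer itself looks conventional. A frequent cause of the flickering-red symptom is the SurfaceTexture not being latched before each draw, or being updated outside the GL thread that owns the texture. A sketch of the expected per-frame sequence, assuming mSurfaceTexture wraps the same OES texture id handed to DirectDrawer:

// Sketch of the per-frame sequence on the GL thread, e.g. in a
// GLSurfaceView.Renderer. Assumes mSurfaceTexture wraps the same OES
// texture id passed to DirectDrawer, and a new frame was signalled via
// setOnFrameAvailableListener().
private final float[] mtx = new float[16];

@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    mSurfaceTexture.updateTexImage();          // latch the newest camera frame
    mSurfaceTexture.getTransformMatrix(mtx);   // required texture-coord transform
    mDirectDrawer.draw(mtx);                   // draw() already applies mtx
}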

tuanbipa commented 7 years ago

@giannisergente @dji-dev @niflying Hi guys, I have the same issue: I receive NV21 data to record video, but there is a significant delay.

            byte[] bytes = new byte[yuvFrame.length];
            System.arraycopy(y, 0, bytes, 0, y.length);
            for (int i = 0; i < u.length; i++) {
                bytes[y.length + (i * 2)] = nv[i];
                bytes[y.length + (i * 2) + 1] = nu[i];
            }
            // single-channel image holding the NV21 data (height * 3/2 rows)
            opencv_core.IplImage yuvImage1 = opencv_core.IplImage.create(width, height * 3 / 2, IPL_DEPTH_8U, 1);
            yuvImage1.getByteBuffer().put(bytes);
            opencv_core.IplImage bgrImage = opencv_core.IplImage.create(width, height, IPL_DEPTH_8U, 3);
            // convert the NV21 data to BGR; without this step bgrImage stays empty
            opencv_imgproc.cvCvtColor(yuvImage1, bgrImage, opencv_imgproc.CV_YUV2BGR_NV21);
            Bitmap modifiedBitmap = IplImageToBitmap(bgrImage);
            org.bytedeco.javacv.AndroidFrameConverter converter = new AndroidFrameConverter();
            Frame modifiedFrame = converter.convert(modifiedBitmap);

raullalves commented 7 years ago

Check this: https://github.com/raullalves/DJI-Drone-Camera-Streaming

oliverou commented 6 years ago

Hi all, we have fixed some bugs in the sample; please get the latest code and check whether the issues still exist.

rautsunil commented 6 years ago

Hi, I am facing an issue while generating the YUV frames. Once I set surface == null in DjiVideodecoder.onsurfacechange(), the YUV listener gets called a few times, but later codec.dequeueOutputBuffer(bufferInfo, 0) returns -1 and inIndex = codec.dequeueInputBuffer(0) is also -1. Does anyone know the reason for this?
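One thing worth checking: with a timeout of 0, both dequeue calls return MediaCodec.INFO_TRY_AGAIN_LATER (which is -1) whenever no buffer happens to be ready, so -1 by itself is not an error. A sketch of a drain loop that tolerates it (the timeout value is arbitrary):

// Sketch: treat INFO_TRY_AGAIN_LATER as "not yet", not as failure.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = codec.dequeueOutputBuffer(info, 10000);   // 10 ms timeout
if (outIndex >= 0) {
    // ... consume the output buffer ...
    codec.releaseOutputBuffer(outIndex, false);
} else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = codec.getOutputFormat();     // color format may change here
} else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // no output ready yet: keep feeding input buffers and try again
}

If -1 persists indefinitely, the decoder may simply be starved of input (no input buffers being queued) rather than broken.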

oliverou commented 6 years ago

Hi @raullalves, could you provide us with the following info so we can investigate the issue?

Thanks!

hweaving commented 4 years ago

Hi @giannisergente, I think we went down the same path. I use a SurfaceTexture and then render it, but it looks like I only get the light level from the camera: a flickering red screen. If I cover the camera, it turns dark. img_3809

I tried the same code to show the phone's own camera, and it works fine.

Did you ever get this working? I have the exact same problem: a flickering red screen that changes if I cover the camera. It works fine if I attach the SurfaceTexture to a TextureView instead of trying to sample it, but I want to sample the feed for scaling and modification.

neilyoung commented 4 years ago

The key point is that this yuvCallback returns a completely unpredictable YUV ordering. It can be NV21, NV12, I420, or YV12. It does not even depend on the drone you are using, but only on the capabilities of the Android device the app is running on. At my request they added a "format indicator" in 4.11, but they messed it up and that format indicator does not help. More on this here: https://github.com/accuware/djistreamerlib/wiki
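If you own the MediaCodec yourself (as this sample does), the decoder at least reports its negotiated layout via getOutputFormat(). A sketch; note this only distinguishes planar from semi-planar, not the U/V order inside a semi-planar buffer, so it cannot fully resolve NV12 vs. NV21:

// Sketch: query the decoder's actual output layout, typically after
// dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED.
MediaFormat out = codec.getOutputFormat();
int colorFormat = out.getInteger(MediaFormat.KEY_COLOR_FORMAT);
switch (colorFormat) {
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:      // I420-style
        // three separate planes: Y, then U, then V
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:  // NV12-style
        // Y plane followed by interleaved chroma (U/V order not specified here)
        break;
    default:
        // vendor-specific format; needs per-device handling
        break;
}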

hweaving commented 4 years ago

@neilyoung I'm having trouble finding the source of djistreamerlib, but are you referring to the YUV callback provided by DJICodecManager? Or are you referring to the result of using a samplerExternalOES in a fragment shader?

My understanding was that samplerExternalOES should allow for sampling a YUV surface for use in rendering RGB directly. The fact that attaching the SurfaceTexture to a TextureView renders correctly makes me think it might be possible.

neilyoung commented 4 years ago

@hweaving There is no source; it is closed source, but you can use the lib for free. I'm referring to the yuvDataCallback provided by DJI, which carries a YUV buffer of unknown format. Please disregard my comment if it does not match your problem.

hweaving commented 4 years ago

Thanks anyway. My problem was specifically with the samplerExternalOES path, since @niflying and I used the same approach and had the same result.

neilyoung commented 4 years ago

OK, forgive me. I just saw the caption and boom :)