chipweinberger / flutter_quick_video_encoder

Quickly encode raw RGB images & PCM audio to MP4 video using the hardware h264 encoder
The Unlicense

Android: UnsupportedColorFormat, YUV420Planar is not supported #3

Closed lrshu closed 3 months ago

lrshu commented 3 months ago

PlatformException(UnsupportedColorFormat, YUV420Planar is not supported, null, null)
stack: #0 StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:651:7)

chipweinberger commented 3 months ago

what device are you using? OS version, etc.?

chipweinberger commented 3 months ago

this package requires the device to support hardware video encoding

lrshu commented 3 months ago
(screenshot attached)

changing the color format to COLOR_FormatYUV420Flexible can fix this issue.
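
Roughly, the change being suggested would land in the encoder setup (a sketch, not the plugin's actual code; mWidth/mHeight/mVideoEncoder are assumed names):

MediaFormat format = MediaFormat.createVideoFormat("video/avc", mWidth, mHeight);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
// was COLOR_FormatYUV420Planar, which not every encoder advertises
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);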

chipweinberger commented 3 months ago

okay, want to open a PR?

If planar is unsupported, fall back to flexible.
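
A minimal sketch of that fallback (selectColorFormat is a made-up name, not something in the plugin):

private int selectColorFormat(MediaCodecInfo.CodecCapabilities caps) {
    // prefer the planar format we already support, otherwise fall back to flexible
    for (int fmt : caps.colorFormats) {
        if (fmt == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) return fmt;
    }
    for (int fmt : caps.colorFormats) {
        if (fmt == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible) return fmt;
    }
    return -1; // neither supported
}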

chipweinberger commented 3 months ago

tbh, I'm not sure how the flexible format works

https://stackoverflow.com/questions/38421564/what-is-color-formatyuv420flexible

chipweinberger commented 3 months ago

looks like we'd need to change other code too, to get the buffer as an Image

chipweinberger commented 3 months ago

what formats does your device support?

lrshu commented 3 months ago
(screenshot listing the device's supported color formats as raw integer values)
chipweinberger commented 3 months ago

I'm not sure which formats those numbers correspond to

chipweinberger commented 3 months ago

I think you are right that we should switch to flexible, but we need to update the code to be more flexible

lrshu commented 3 months ago

we can look up the numbers here: https://minimum-viable-product.github.io/marshmallow-docs/reference/android/media/MediaCodecInfo.CodecCapabilities.html#COLOR_FormatYUV420Flexible
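
For example, something like this dumps the raw integers the device's AVC encoders advertise, so they can be matched against the constants on that page (a sketch; the "fqve" log tag is made up):

private void logSupportedColorFormats() {
    for (MediaCodecInfo info : new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos()) {
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            for (int fmt : info.getCapabilitiesForType(type).colorFormats) {
                Log.d("fqve", info.getName() + " color format: " + fmt
                        + " (0x" + Integer.toHexString(fmt) + ")");
            }
        }
    }
}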

chipweinberger commented 3 months ago

so:

- OMX_QCOM_COLOR_FormatYUV420PackedSemiPlanar32m
- COLOR_QCOM_FormatYUV420SemiPlanar
- COLOR_FormatSurface
- COLOR_FormatYUV420Flexible
- COLOR_FormatYUV420SemiPlanar

chipweinberger commented 3 months ago

it would be easy to add support for

COLOR_FormatYUV420SemiPlanar (semi-planar format): this format combines the U and V components into a single array of interleaved UV pairs, following the block of Y values. This results in two blocks: one for Y and one for UV.
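
For reference, that layout looks like this when repacking a planar YUV420 buffer into semi-planar order (a hypothetical helper, not plugin code):

private static byte[] planarToSemiPlanar(byte[] i420, int width, int height) {
    int ySize = width * height;
    int uvSize = ySize / 4;          // each chroma plane is quarter resolution
    byte[] out = new byte[ySize + 2 * uvSize];
    // block 1: the Y values, unchanged
    System.arraycopy(i420, 0, out, 0, ySize);
    // block 2: U and V interleaved as UV pairs
    for (int i = 0; i < uvSize; i++) {
        out[ySize + 2 * i]     = i420[ySize + i];           // U (Cb)
        out[ySize + 2 * i + 1] = i420[ySize + uvSize + i];  // V (Cr)
    }
    return out;
}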

chipweinberger commented 3 months ago

please open a PR

either for SemiPlanar support or flexible support

please read this: https://stackoverflow.com/questions/38421564/what-is-color-formatyuv420flexible

chipweinberger commented 3 months ago

from ChatGPT:


To support the COLOR_FormatYUV420Flexible in your Android video encoding application, you need to modify how you manage and configure the input buffers, especially since you will be interfacing directly with the YUV format rather than just passing raw RGB or RGBA data. Here’s a step-by-step approach on how to adapt your current setup:

1. Modify Video Format Configuration

Change the video encoder configuration to accept the COLOR_FormatYUV420Flexible format, which corresponds to MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible. First, ensure that this format is supported by your device’s encoder:

private int getColorFormat() {
    MediaCodecInfo codecInfo = getCodecInfo("video/avc");
    if (codecInfo != null) {
        MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType("video/avc");
        for (int colorFormat : capabilities.colorFormats) {
            if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible) {
                return colorFormat;
            }
        }
    }
    return -1; // Default to -1 if color format is not supported
}
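
The getCodecInfo helper above isn't shown in the ChatGPT answer; one possible version, using MediaCodecList (API 21+):

private MediaCodecInfo getCodecInfo(String mimeType) {
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase(mimeType)) {
                return info;
            }
        }
    }
    return null; // no matching encoder found
}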

2. Adapt appendVideoFrame Method

Since the COLOR_FormatYUV420Flexible can represent various YUV formats, it's important to handle the input buffer according to the format selected by the encoder. When using COLOR_FormatYUV420Flexible, use the Image class to interface with the buffer instead of raw byte arrays:

case "appendVideoFrame":
    Image image = mVideoEncoder.getInputImage(inIdx);
    if (image != null) {
        fillImagePlanes(image, rawRgba, mWidth, mHeight);
    }
    mVideoEncoder.queueInputBuffer(inIdx, 0, calculateYUVSize(mWidth, mHeight), presentationTime, 0);
    break;

private void fillImagePlanes(Image image, byte[] rgba, int width, int height) {
    // Convert RGBA to planar YUV420, then copy each plane into the codec's Image.
    byte[] yuv = rgbaToYuv420Planar(rgba, width, height);
    Image.Plane[] planes = image.getPlanes(); // getPlanes() returns Image.Plane[], not ByteBuffer[]
    // Note: this assumes tightly packed planes (rowStride == width, pixelStride == 1);
    // a robust implementation must honor each plane's getRowStride()/getPixelStride().
    planes[0].getBuffer().put(yuv, 0, width * height);                                       // Y
    planes[1].getBuffer().put(yuv, width * height, width * height / 4);                      // U (Cb)
    planes[2].getBuffer().put(yuv, width * height + width * height / 4, width * height / 4); // V (Cr)
}

3. Implement calculateYUVSize Helper Function

This function will calculate the necessary size for the YUV buffer based on the video dimensions:

private int calculateYUVSize(int width, int height) {
    // YUV 4:2:0: one full-resolution Y plane plus two quarter-resolution chroma planes (1.5 bytes/pixel)
    return width * height + 2 * (width / 2 * height / 2);
}

4. Ensure Proper Handling in onMethodCall

Check and handle the Image objects correctly in your appendVideoFrame method and ensure that the buffer sizes and formats are correctly computed and assigned.

5. Test Thoroughly

Test your implementation across different devices to ensure compatibility, as handling of COLOR_FormatYUV420Flexible can vary by manufacturer and GPU capabilities.

With these steps, your Android encoder application should be able to handle COLOR_FormatYUV420Flexible for YUV data input and manage the encoding process more flexibly and efficiently.

chipweinberger commented 3 months ago

I'm not sure calculateYUVSize is right.

I think we should do this instead, using getInputBuffer() + remaining():

https://stackoverflow.com/questions/35394946/using-getinputimage-with-mediacodec-for-encoding
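
Roughly like this (a sketch following the naming in the snippets above; per the MediaCodec docs, the ByteBuffer must not be reused once getInputImage() has been called for the same index, so only the size is kept):

int inIdx = mVideoEncoder.dequeueInputBuffer(-1); // block until an input buffer is free
ByteBuffer inputBuffer = mVideoEncoder.getInputBuffer(inIdx);
int size = inputBuffer.remaining(); // what the codec actually expects, instead of calculateYUVSize()
Image image = mVideoEncoder.getInputImage(inIdx);
if (image != null) {
    fillImagePlanes(image, rawRgba, mWidth, mHeight);
}
mVideoEncoder.queueInputBuffer(inIdx, 0, size, presentationTime, 0);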

chipweinberger commented 3 months ago

fixed in 1.5.0

we now use COLOR_FormatYUV420Flexible