Wrong - MediaCodec has been supported since FFmpeg 3.1
Got working proof of concept with:
FFmpeg was built by changing the configure file, replacing:
SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'
with:
SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'
and building with the following script:
#!/bin/bash
NDK=/data/meglickib/android-ndk/android-ndk-r13b
SYSROOT=$NDK/platforms/android-24/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--target-os=android \
--enable-shared \
--disable-static \
--disable-doc \
--disable-ffplay \
--disable-ffprobe \
--disable-symver \
--enable-hwaccels \
--enable-jni \
--enable-mediacodec \
--enable-decoder=h264_mediacodec \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--arch=arm \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make -j4
make install
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one
NHVD was compiled with hand-written Android.mk and Application.mk and ndk-build
Implementation changes include:
Hmm, it seems this went through the software path, not hardware (not through MediaCodec)
The scripts, Android.mk, Application.mk and HVD software decoding path were added in:
NHVD android branch
OK, got one step further - successfully initialized the hardware with MediaCodec.
For any lost souls following this path, the catches are the following:
JNI OnLoad
In your shared library, export JNI_OnLoad:
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM* vm, void* aReserved)
{
//store the pointer to virtual machine unless you can do everything you need from OnLoad
g_vm = vm;
//this is just to get something in adb logcat
__android_log_write(ANDROID_LOG_DEBUG, "hvd", "JNI ON LOAD\n");
//return the version you need, you may also check here if it is supported
return JNI_VERSION_1_6;
}
Attach your thread to VM if needed
JNIEnv *jni_env = 0;
int getEnvStat = (*g_vm)->GetEnv(g_vm,(void**) &jni_env, JNI_VERSION_1_6);
if (getEnvStat == JNI_EDETACHED)
{
__android_log_print(ANDROID_LOG_DEBUG, "hvd", "getenv not attached");
jint result = (*g_vm)->AttachCurrentThread(g_vm, &jni_env, NULL);
if(result != JNI_OK)
    ; //error
}
else if (getEnvStat == JNI_OK)
__android_log_print(ANDROID_LOG_DEBUG, "hvd", "already attached\n", JNI_OK);
else if (getEnvStat == JNI_EVERSION)
__android_log_print(ANDROID_LOG_DEBUG, "hvd", "get env version not supported");
Set Java VM for FFmpeg
Without that you will get errors along the lines of "VM not set" (I don't remember exactly); av_jni_set_java_vm is declared in libavcodec/jni.h.
av_jni_set_java_vm(g_vm, NULL);
Find the h264_mediacodec decoder and alloc the context, e.g.
decoder=avcodec_find_decoder_by_name("h264_mediacodec");
decoder_ctx = avcodec_alloc_context3(decoder);
Before opening the context you HAVE to supply additional information.
Without that you are going to get "Operation not permitted" in avcodec_open2.
If you are decoding from a file you can get it from avformat:
if (avformat_open_input(&input_ctx, "your filename", NULL, NULL) != 0)
    // ... error
if (avformat_find_stream_info(input_ctx, NULL) < 0)
    // ... error
if ((ret = av_find_best_stream(input_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &stream_decoder, 0)) < 0)
    // ... error
video_stream = ret;
video = input_ctx->streams[video_stream];
One more catch above - av_find_best_stream can override your "h264_mediacodec" decoder with plain "h264" if you pass it the same variable you use for the decoder (hence the separate stream_decoder).
Now the important part:
if (avcodec_parameters_to_context(decoder_ctx, video->codecpar) < 0)
    //error
So some of the fields set by avcodec_parameters_to_context are mandatory for MediaCodec (probably extradata).
You may finally open the context:
if ((err = avcodec_open2(decoder_ctx, decoder, NULL)) < 0)
    //error
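After opening, the decode loop itself is just the standard send/receive API; a minimal sketch (error handling trimmed; input_ctx, video_stream and decoder_ctx as above):
AVPacket packet;
AVFrame *frame = av_frame_alloc();

while (av_read_frame(input_ctx, &packet) >= 0)
{
    if (packet.stream_index == video_stream)
    {
        //feed the decoder; with h264_mediacodec this goes through the hardware path
        if (avcodec_send_packet(decoder_ctx, &packet) < 0)
            break; //error

        //drain all frames produced for this packet
        while (avcodec_receive_frame(decoder_ctx, frame) == 0)
        {
            //consume the frame here (as noted below, data comes back as NV12)
        }
    }
    av_packet_unref(&packet);
}

av_frame_free(&frame);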
OK, successfully decoded in hardware (from a file) and rendered it.
Some funny warnings are present in logcat.
Data is returned in NV12, which needs a new shader implementation.
I still need a method of initializing from a raw stream instead of a file (and avformat).
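For reference, NV12 is a full-resolution Y plane followed by a half-resolution interleaved UV plane, so whatever shader is written essentially has to do the following per pixel (a plain C sketch; full-range BT.601 coefficients are an assumption, real MediaCodec output may be limited range):
#include <stdint.h>
#include <libavutil/frame.h>

//NV12 layout: data[0]/linesize[0] is the Y plane, data[1]/linesize[1] is the interleaved UV plane
static void nv12_pixel_to_rgb(const AVFrame *f, int x, int y, uint8_t rgb[3])
{
    int Y = f->data[0][y * f->linesize[0] + x];
    int U = f->data[1][(y / 2) * f->linesize[1] + (x / 2) * 2 + 0] - 128;
    int V = f->data[1][(y / 2) * f->linesize[1] + (x / 2) * 2 + 1] - 128;

    //full-range BT.601: R = Y + 1.402 V, G = Y - 0.344 U - 0.714 V, B = Y + 1.772 U (fixed point)
    int r = Y + ((91881 * V) >> 16);
    int g = Y - ((22554 * U) >> 16) - ((46802 * V) >> 16);
    int b = Y + ((116130 * U) >> 16);

    rgb[0] = r < 0 ? 0 : r > 255 ? 255 : r;
    rgb[1] = g < 0 ? 0 : g > 255 ? 255 : g;
    rgb[2] = b < 0 ? 0 : b > 255 ? 255 : b;
}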
So some of the fields set in avcodec_parameters_to_context are mandatory for MediaCodec (probably extradata)
Confirmed with a test - removing extradata led to failure when opening the context.
This is also confirmed by the MediaCodec documentation - SPS and PPS are needed as setup data.
FWIW - the minimal codec context parameters to make it work are:
This however only holds for this particular case (this device, this MediaCodec backend, these software versions, etc.).
A rough idea of how to extract extradata with bitstream filters was implemented in the
extract_extradata branch in 2b9bf0f0535ae46a4a4287ba411675d9d8374049 (may not be entirely correct, but it works).
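For anyone trying to reproduce it, the gist of the bitstream filter approach is roughly the following (a sketch, not the exact branch code, FFmpeg 4.x API assumed; extract_extradata attaches the found SPS/PPS as AV_PKT_DATA_NEW_EXTRADATA side data, and av_bsf_send_packet takes ownership of the packet):
#include <string.h>
#include <libavcodec/avcodec.h>

//hypothetical helper: fill decoder extradata from the SPS/PPS found in a raw Annex B packet
static int extradata_from_packet(AVCodecContext *ctx, AVPacket *in)
{
    const AVBitStreamFilter *filter = av_bsf_get_by_name("extract_extradata");
    AVBSFContext *bsf = NULL;
    AVPacket out = {0};
    int err;

    if (!filter || (err = av_bsf_alloc(filter, &bsf)) < 0)
        return AVERROR(ENOSYS);

    bsf->par_in->codec_id = AV_CODEC_ID_H264;

    if ((err = av_bsf_init(bsf)) < 0 ||
        (err = av_bsf_send_packet(bsf, in)) < 0 ||
        (err = av_bsf_receive_packet(bsf, &out)) < 0)
    {
        av_bsf_free(&bsf);
        return err;
    }

    int size = 0;
    uint8_t *side = av_packet_get_side_data(&out, AV_PKT_DATA_NEW_EXTRADATA, &size);
    int ret = AVERROR(EAGAIN); //no parameter sets in this packet yet - try the next one

    if (side && (ctx->extradata = av_mallocz(size + AV_INPUT_BUFFER_PADDING_SIZE)))
    {
        memcpy(ctx->extradata, side, size);
        ctx->extradata_size = size;
        ret = 0;
    }

    av_packet_unref(&out);
    av_bsf_free(&bsf);
    return ret;
}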
All put together:
May be used to test if the idea will work.
Implemented the rough idea and confirmed that it is possible to make it work.
When streaming to Unity (MediaCodec -> NV12 -> pointer to Unity -> texture update -> shader) there doesn't seem to be an improvement over software decoding.
There is an occasional "hiccup" in the video, short but noticeable.
Other approaches need to be tested (e.g. decoding directly to an Android surface and displaying that in Unity).
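For the surface approach, FFmpeg's MediaCodec wrapper has a dedicated API in libavcodec/mediacodec.h; roughly (an untested sketch, the surface jobject is assumed to come from the Java/Unity side):
#include <libavcodec/mediacodec.h>

//before avcodec_open2: ask h264_mediacodec to render directly to an android.view.Surface
AVMediaCodecContext *mc = av_mediacodec_alloc_context();
if (av_mediacodec_default_init(decoder_ctx, mc, surface) < 0) //surface is a jobject
    ; //error

//after avcodec_receive_frame: frames arrive as AV_PIX_FMT_MEDIACODEC and
//frame->data[3] holds an AVMediaCodecBuffer that has to be rendered (or dropped) explicitly
if (frame->format == AV_PIX_FMT_MEDIACODEC)
    av_mediacodec_release_buffer((AVMediaCodecBuffer *)frame->data[3], 1 /*render*/);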
Is there any other way to open avcodec successfully from a stream instead of a file? Because I don't have codecpar to init the codec context and I get the "operation not permitted".
See https://github.com/bmegli/hardware-video-decoder/commit/087041cd4cef4a6edb69fe3df3d90aae492607b3, it wasn't entirely correct but hardware decoding worked.
I never got it working to my satisfaction though (something seemed incorrect).
You may also try your luck with av_probe_input_buffer (see also the custom I/O sketch below).
See also DJI streaming, another interesting way to do it (somewhat different - as far as I remember, MediaCodec is used from Java and FFmpeg is only a helper to extract information for MediaCodec init).
Possibly ask on the FFmpeg mailing lists what the idiomatic way to do it is.
I eventually got tired of Android where Java is the first class citizen and switched to Linux.
I no longer need this feature.
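For completeness, the usual FFmpeg pattern for driving avformat from a raw in-memory stream rather than a file is a custom AVIOContext (a generic sketch, not something from this repository; read_callback and stream_state are placeholders for whatever feeds your stream):
#include <libavformat/avformat.h>

//hypothetical callback: copy up to buf_size bytes of the raw H264 stream into buf,
//return the number of bytes copied or AVERROR_EOF when the stream ends
static int read_callback(void *opaque, uint8_t *buf, int buf_size);

void open_from_stream(void *stream_state)
{
    const int buffer_size = 4096;
    uint8_t *buffer = av_malloc(buffer_size);

    AVIOContext *avio = avio_alloc_context(buffer, buffer_size, 0 /*read only*/,
                                           stream_state, read_callback, NULL, NULL);

    AVFormatContext *input_ctx = avformat_alloc_context();
    input_ctx->pb = avio;

    //from here on the flow is the same as the file based case above
    if (avformat_open_input(&input_ctx, NULL, NULL, NULL) != 0)
        ; //error
    //avformat_find_stream_info, av_find_best_stream, avcodec_parameters_to_context, ...
}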
Now, is there any way to use FFmpeg MediaCodec on Android?
Not through this library. With FFmpeg in general, yes.
It seems there isn't any open source project using FFmpeg for hardware decoding on Android :'(
I used the H264 bitstream parser from the WebRTC source to parse the SPS and PPS data from the H264 bitstream and then assigned it to extradata. Currently it works OK but I don't know whether it is stable.
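What that amounts to is roughly the following (a sketch; the sps/pps buffers and sizes are assumed to come from whatever parser is used, and Annex B start codes are prepended since SPS and PPS are the setup data MediaCodec wants):
#include <string.h>
#include <libavcodec/avcodec.h>

//hypothetical helper: build AVCodecContext extradata from parsed SPS/PPS NAL units (Annex B)
static int set_extradata_from_sps_pps(AVCodecContext *ctx,
                                      const uint8_t *sps, int sps_size,
                                      const uint8_t *pps, int pps_size)
{
    static const uint8_t start_code[4] = {0, 0, 0, 1};
    int size = 4 + sps_size + 4 + pps_size;

    ctx->extradata = av_mallocz(size + AV_INPUT_BUFFER_PADDING_SIZE);
    if (!ctx->extradata)
        return AVERROR(ENOMEM);

    memcpy(ctx->extradata, start_code, 4);
    memcpy(ctx->extradata + 4, sps, sps_size);
    memcpy(ctx->extradata + 4 + sps_size, start_code, 4);
    memcpy(ctx->extradata + 8 + sps_size, pps, pps_size);
    ctx->extradata_size = size;

    return 0;
}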
This will probably need:
The implementation will need a separate branch for 4.x FFmpeg.
While switching to FFmpeg 4.x we may simplify the PixelFormat hacks that we use; there is infrastructure for that in FFmpeg 4.x. This is already noted in comments in the relevant places of hvd.c.