jacek-marchwicki opened this issue 12 years ago (status: Open)
I just wanted to say that we're looking into these issues ourselves, thanks for spelling it all out. Hopefully we'll be able to contribute back in the future.
What kind of performance have you seen with this build? With libstagefright enabled, I'm getting slower performance than with sw decoder...still trying to debug why...fun stuff :-)
cjmoreira - I'm still trying to get it not to crash on my Galaxy Tab or Nexus 10 :/
However, I do notice there is a lot of verbose frame-by-frame logging in player.c - this is a long shot, but maybe that's slowing it down.
I did a performance analysis in Eclipse, and the bulk of the time is spent rendering the bitmap. I'm going to try to figure out how to render on the native side (using OpenGL) and avoid using bitmaps.
Here's a place to start: http://vec.io/posts/how-to-render-image-buffer-in-android-ndk-native-code
It seems like the current code is looking at the frame buffer and copying the pixel data into Java bitmaps. I'm new to this, but that seems super slow. If you look at how AwesomePlayer.cpp does it, they pass a pointer to an ANativeWindow into OMXCodec::Create, and then data can be copied there as frames decode. That's going to be a lot more efficient.
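For reference, here is a rough sketch of that approach as I understand it from AwesomePlayer.cpp; the mClient, videoTrack and nativeWindow variables are placeholders assumed to exist in the caller, not code from this project:
// Hedged sketch: let libstagefright render straight into an ANativeWindow,
// the way AwesomePlayer.cpp does, instead of copying pixels into Java bitmaps.
#include <media/stagefright/OMXClient.h>
#include <media/stagefright/OMXCodec.h>
#include <media/stagefright/MediaSource.h>
#include <system/window.h>
using namespace android;
static sp<MediaSource> create_hw_decoder(OMXClient &mClient,
                                         const sp<MediaSource> &videoTrack,
                                         const sp<ANativeWindow> &nativeWindow) {
    // Passing the window as the last argument lets the codec queue decoded
    // buffers directly to the Surface, so no pixel copy is needed on the Java side.
    return OMXCodec::Create(mClient.interface(),
                            videoTrack->getFormat(),
                            false /* createEncoder */,
                            videoTrack,
                            NULL /* matchComponentName */,
                            0 /* flags */,
                            nativeWindow);
}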
@acgourley Yes - we probably should pass an ANativeWindow to OMXCodec - this is what I am currently working on - but it forces some huge changes to the player code.
If someone is interested, there is a very-very-very-pre-alpha, pre-working version at https://review.appunite.com/#/c/3500/ that can play some 720p movies, without synchronization.
More changes in the very-super-pre-alpha version: https://review.appunite.com/#/c/1779/5 - some 1080p videos now work with good synchronization on the hardware decoder. Better performance of the software decoder (faster rescaling and color conversion) with better rescaling quality at 480p. (Tested on a Galaxy Tab 2.)
ps: I forgot about the three-way sync of video/audio/clock - maybe there will be the ability to play video without audio.
This is great. Can you push this to github as a branch or tell me how to get the code? I can't figure out how to pull it from your review git.
You have a link to check out: git fetch https://jacek.marchwicki@review.appunite.com/androidffmpeg refs/changes/79/1779/6 && git checkout FETCH_HEAD
But do not forget to add "GIT_SSL_NO_VERIFY=true" in front of the command.
I'm getting a black screen when enabling hardware decoding. Which version of the Android source are you using? The version I'm using required that I change the following line in ffstagefright.cpp:
from: int err = s->mNativeWindow->queueBuffer_DEPRECATED(window, gBuffer.get());
to: int err = s->mNativeWindow->queueBuffer(window, gBuffer.get());
Perhaps this is the cause of the problem for me....
I've had the blank screen issue occur when I was not lining up all the source, libs, and device to the same version of Android. I think you're pointed roughly in the right direction, as the API difference you described occurs between ICS and Jelly Bean. An author of a popular Android media player told me he compiles libraries for the major releases of Android and dynamically loads them at runtime.
Did you encounter a hang when exiting the video? I haven't debugged it yet, just wondering...this occurred with both the sw and hw decoders....
@cjmoreira As far as I know, the method was renamed to ..._DEPRECATED in Jelly Bean. If you have older Android source code you have to change the name back - probably there should be some "#ifdef" for this.
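Something like the following is what I mean - just a sketch around the line quoted above, assuming you pass the platform version in from your makefile (e.g. LOCAL_CFLAGS += -DPLATFORM_SDK_VERSION=$(PLATFORM_SDK_VERSION)), since there is no built-in preprocessor define for it:
// Hedged sketch: select the queueBuffer variant per Android source tree.
// The exact cutoff is an assumption - adjust it to the tree you build against.
#if PLATFORM_SDK_VERSION >= 16   /* Jelly Bean renamed the fence-less variant */
    int err = s->mNativeWindow->queueBuffer_DEPRECATED(window, gBuffer.get());
#else                            /* ICS and older use the two-argument name */
    int err = s->mNativeWindow->queueBuffer(window, gBuffer.get());
#endif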
@cjmoreira Stopping the video and seeking currently do not work.
@acgourley I am compiling this code against the Jelly Bean Android source code, but testing on a Galaxy Tab 2 with ICS and a Nexus 4 with Jelly Bean.
@cjmoreira Sorry.. there is a black screen on my Jelly Bean Nexus 4 too.
@jacek-marchwicki I haven't had time to debug this further. FYI, I also had the same issue on the Nexus 7. Where did you guys download the latest Jelly Bean Android source from? I haven't been able to find it....the official Google source still only has the 4.1 release.
It has 4.2.2_r1 too:
git ls-remote -h http://android.googlesource.com/platform/manifest.git | grep 4.2.2_r1
If you resolve this black screen issue, please let me know. I'll keep trying to figure out this problem, but so far without progress.
I'm sorry I don't have time to dive into it in more detail, but I thought I would suggest that for Jelly Bean you look at how MediaCodec (http://developer.android.com/reference/android/media/MediaCodec.html) renders to the Surface that can be passed in during the call to: public void configure (MediaFormat format, Surface surface, MediaCrypto crypto, int flags)
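For native code there is also a later NDK counterpart of that call (AMediaCodec, available from API 21), which follows the same pattern. A minimal hedged sketch, assuming the ANativeWindow comes from ANativeWindow_fromSurface() on the app's Surface and that you link against libmediandk:
// Hedged sketch of a surface-backed decoder via the NDK MediaCodec wrapper.
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <android/native_window.h>
AMediaCodec *create_surface_decoder(ANativeWindow *window, int width, int height) {
    AMediaCodec *codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaFormat *format = AMediaFormat_new();
    AMediaFormat_setString(format, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_HEIGHT, height);
    // Passing the window here is what makes the codec render to the Surface;
    // AMediaCodec_releaseOutputBuffer(codec, index, true) then displays a frame.
    AMediaCodec_configure(codec, format, window, NULL /* crypto */, 0 /* flags */);
    AMediaCodec_start(codec);
    AMediaFormat_delete(format);
    return codec;
}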
@jacek-marchwicki
It is the queueBuffer call that is failing on the Nexus 4:
int err = s->mNativeWindow->queueBuffer(window, gBuffer.get());
av_log(s->avctx, AV_LOG_DEBUG, "thread - length6, err=%d\n", err);
It did not return an error, but there is always a message about the SurfaceView not being owned by the client. I don't know what this means yet....still digging.
03-15 00:14:47.586: V/ffmpeg(5882): thread - length6, err=0
03-15 00:14:47.596: E/BufferQueue(158): [SurfaceView] cancelBuffer: slot 5 is not owned by the client (state=2)
Updating to the latest 4.2.2 source code and calling the un-deprecated function:
int err = s->mNativeWindow->queueBuffer(window, gBuffer.get(), c);
now gives:
03-15 01:59:03.947: E/SurfaceTextureClient(9984): queueBuffer: error queuing buffer to SurfaceTexture, -22
@jacek-marchwicki I got the error messages to go away by setting the MetaData "kKeyRendered" to 1 after queueBuffer. Unfortunately there's still a black screen for me.....I found this in the Android reference player, AwesomePlayer.cpp. I'm still very new to this, so I'm very slow at debugging it....
int err = s->mNativeWindow->queueBuffer(window, gBuffer.get(), -1);
av_log(s->avctx, AV_LOG_DEBUG, "thread - length6, err=%d\n", err);
sp<MetaData> metaData = buffer->meta_data();
metaData->setInt32(kKeyRendered, 1);
Hello, do you have any news about hardware acceleration support?
I can say we got this working for our application. We did not end up using this project, just rolling it ourselves, but I'm still happy to share what we learned. First of all, this will be something that must be built several times against several lib/source bundles in order to get coverage of most chipset/OS combos. Then the right library needs to be loaded dynamically depending on the device the application is running on.
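To illustrate the runtime-selection half of that, here is a hedged sketch in native code; the library names (libplayer-ics.so / libplayer-jb.so) and the decoder_init entry point are hypothetical placeholders, not anything from this project:
// Hedged sketch: load a per-Android-release decoder backend at runtime.
#include <dlfcn.h>
#include <sys/system_properties.h>
#include <stdio.h>
#include <stdlib.h>
typedef int (*decoder_init_fn)(void);
static void *load_decoder_backend(void) {
    char sdk[PROP_VALUE_MAX];
    __system_property_get("ro.build.version.sdk", sdk);   // e.g. "15" for ICS, "16"+ for JB
    const char *lib = atoi(sdk) >= 16 ? "libplayer-jb.so" : "libplayer-ics.so";
    void *handle = dlopen(lib, RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "dlopen(%s) failed: %s\n", lib, dlerror());
        return NULL;
    }
    decoder_init_fn init = (decoder_init_fn) dlsym(handle, "decoder_init");
    if (init) init();   // hypothetical entry point exported by each backend
    return handle;
}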
Oh, maybe you can share your project or libs?
@acgourley
By linking to major versions, do you mean API 9, 11, and 14, for example?
@pablo-navarro I'm not sure which versions - I actually never got this working in a uniform way. I just confirmed it would be required for pre-Jelly-Bean devices, both by inference and through conversations with someone who has a hw-enabled player in the market.
@ksotik Sorry for letting this slide. The best place to start is to carefully and fully read http://vec.io/posts/use-android-hardware-decoder-with-omxcodec-in-ndk and then to understand that his examples will fail mysteriously if you don't do what I suggested above: make sure you have the source and libs for your Android version so you can compile against them; if they are not lined up, it can fail in confusing ways. I also learned a lot studying AwesomePlayer.cpp, which is how Android uses OMXCodec internally. I ended up NOT using FFmpeg because I didn't require the file format support.
@acgourley
I believe that the very popular player in the market you are referring to links against AOSP sources 9, 11 and 14. I believe so because its lib names look something like dumblib.9.so, dumblib.11.so and dumblib.14.so.
I also experience a black screen on Jelly Bean devices. I've tried lining up the AOSP sources and NDK API source with the device version, with the same black-screen results. I might be wrong, but I think the black screen on JB is not related to not lining up sources.
Anyway, I have been playing with the code and I managed to show a couple of frames, but then a black screen again.
What I did was force the "there is a decoded frame" branch in ff_stagefright_decode_frame:
if (s->mOutDecodeError) {
av_log(avctx, AV_LOG_ERROR, "Error in decoder\n");
ret = -1;
// } else if (!is_buffer_empty(s)) {
} else if (1) {
av_log(s->avctx, AV_LOG_DEBUG, "decode_frame - there is a decoded frame\n");
int current = s->mOutFrameCurrent;
*pict = s->mOutFrameBuffer[current];
*data_size = sizeof(AVFrame);
s->mOutFrameCurrent = get_next_out_frame(current);
pthread_cond_broadcast(&s->mCond);
} else {
av_log(s->avctx, AV_LOG_DEBUG, "decode_frame - there is no decoded frame\n");
*data_size = 0;
}
Any ideas, someone?
Hi, thanks for your code, it's so perfect. I had some trouble using MediaCodec for hardware decoding. First I demux the data source with ffmpeg into video and audio streams, then I pass the video stream data to MediaCodec to decode. Right now it can only decode the m3u8 format; all other formats fail. When it runs mediaCodec.dequeueOutputBuffer(), it returns MediaCodec.INFO_TRY_AGAIN_LATER all the time, while when I use MediaExtractor to demux the data source it returns normally. Maybe there is some difference between the ffmpeg and MediaExtractor demuxers? If you know anything about this problem, please give me some support.
@jacek-marchwicki @acgourley @cjmoreira Has the Android 4.1+ black screen been resolved?
I have resolved the black screen; the main code is in AwesomePlayer.cpp.
@crossle
Can you be more specific?
@crossle
Maybe you can give us a hint of what to look for in AwesomePlayer's code.
Thanks
@pablo-navarro, I'm developing Vitamio. The Vitamio HW decoder is based on AwesomePlayer.cpp. I recompile libstagefright.so and push it to my phone (of course, you must have root permission), turn on logging with #define LOG_NDEBUG 0, add some log statements to AwesomePlayer.cpp, and look at what actually runs in AwesomePlayer.
What's your issue now?
@crossle
So you are a Vitamio developer and you have solved the black screen issue in your project....Good for you.
But tell me, what's the point of coming here to comment on an open source project just to say that you have solved your issues in your private, closed source project? And on top of that, without giving any useful information.
I resolved the main problem; earlier I had forgotten this line from AwesomePlayer.cpp. Hope it helps you.
// Even if set scaling mode fails, we will continue anyway
setVideoScalingMode_l(mVideoScalingMode);
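If I read AwesomePlayer.cpp correctly, setVideoScalingMode_l() boils down to calling native_window_set_scaling_mode() on the window the codec renders into. A minimal sketch of doing the same in this project's native code follows; whether this is actually the missing piece in ffstagefright.cpp is my assumption:
// Hedged sketch: set the scaling mode on the ANativeWindow before queueing
// decoded buffers, mirroring what setVideoScalingMode_l() does in AwesomePlayer.
// 'window' is assumed to be the same ANativeWindow that queueBuffer() is called on.
#include <system/window.h>
static int set_scale_to_window(ANativeWindow *window) {
    return native_window_set_scaling_mode(
            window, NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
}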
@crossle
Thanks, it's much appreciated.
I applied what you mentioned, but the black screen issue here does not seem to be related to setting the scaling mode:
E/Surface(1581): getSlotFromBufferLocked: unknown buffer: 0x0
@pablo-navarro
A black screen can have many causes. You must make sure your MediaBuffer actually has buffer data.
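A trivial check along those lines, sketched under the assumption that the render path receives a stagefright MediaBuffer* (the helper name is illustrative):
// Hedged sketch: skip buffers that carry no pixel data before queueing them.
#include <media/stagefright/MediaBuffer.h>
using namespace android;
static bool has_frame_data(MediaBuffer *buffer) {
    // A zero-length range means there is nothing to display for this buffer.
    return buffer != NULL && buffer->range_length() > 0;
}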
I moved hardware decoding code from review to: https://github.com/appunite/AndroidFFmpeg/tree/hardware-decoding
Introduction
There is some work that I started to support hardware decoding via libstagefright: https://review.appunite.com/#/c/1779/. This is not in the github repo because the commit is not finished - there is a lot of work to do. Everyone is welcome to download this code, edit it, and make patches.
Problems:
More detailed problems:
Finding solutions
I have had some conversations with Simon Robinson, who is interested in hardware decoding too. Those conversations pushed this subject forward.
Color conversion
Simon pointed out that the Galaxy OMX decoder uses a non-standard stride (rounded up to a multiple of 16) plus additional extra bytes of padding. This issue occurs on the Galaxy S3. Simon suggested that it can be solved by enumerating the AVC decoders and, depending on whether the decoder is OMX.SEC.AVC.Decoder or OMX.SEC.avc.dec, using a different stride/padding mechanism. http://code.google.com/p/android/issues/detail?id=37768 http://mailman.videolan.org/pipermail/vlc-devel/2012-September/090263.html
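To make the stride point concrete, here is a hedged sketch of copying one plane when the decoder's stride is rounded up to a multiple of 16; all names here are illustrative, not taken from the patch:
// Hedged sketch: per-row copy that respects a decoder stride padded to a
// multiple of 16; the padding bytes at the end of each row are skipped.
#include <string.h>
#include <stdint.h>
static void copy_plane(uint8_t *dst, int dst_stride,
                       const uint8_t *src, int src_stride,
                       int width, int height) {
    for (int y = 0; y < height; ++y)
        memcpy(dst + y * dst_stride, src + y * src_stride, width);
}
// For the padded case described above, something like:
//   int src_stride = (display_width + 15) & ~15;   /* round up to a multiple of 16 */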
Simon pointed out that the Qualcomm hardware AVC decoder uses a non-standard NV12 (0x7FA30C0) colorspace on the Nexus 4. Probably a much better way is to pass an ANativeWindow to the decoder; then Android will automatically do the color conversion and any other padding operations that are needed.
Considerations: