bytedeco / javacv

Java interface to OpenCV, FFmpeg, and more

Refactor examples etc. to use android camera2 #163

Open lfdversluis opened 9 years ago

lfdversluis commented 9 years ago

hardware.camera is deprecated. The examples should be updated to use camera2.

If I find some time I will take a look at it.

saudet commented 9 years ago

Great, thanks!

lfdversluis commented 9 years ago

I noticed that HttpBuilder has also been removed, and as of SDK 23 you should use URLConnection; see this post.

I was thinking, once I toy around with a sample of my own, I may want to include http://square.github.io/okhttp/. @saudet, are you ok with using this library? It's well-known and handles some errors and stuff.

saudet commented 9 years ago

Where are we using HttpBuilder? I'm not seeing it in the samples.

Anyway, the samples directory in JavaCV is meant to contain small, self-contained samples as a sort of reference for the API. More complex samples are very welcome as well, but it's probably best to provide project files too, and those would belong in this repository: https://github.com/bytedeco/sample-projects What do you think? Sounds good?

lfdversluis commented 9 years ago

Ah, sorry, my bad. It's been a while since I worked on some experiments using the (awesome) library. I had some HTTP call and assumed it belonged to the sample, which I should've checked first. Ignore my previous post :smile:

Once I toy around a bit again with the current version, I hope to find some time to look into the camera2 API.

kmlx commented 8 years ago

I've written the following in #298:

Use RGBA_8888: 8 bits per channel, 4 channels

yuvImage = new Frame(width, height, Frame.DEPTH_UBYTE, 4);

Initialise the ImageReader with PixelFormat.RGBA_8888, like so:

imageReader = ImageReader.newInstance(VIDEO_WIDTH, VIDEO_HEIGHT, PixelFormat.RGBA_8888, 1);
imageReader.setOnImageAvailableListener(onImageAvailableListener, null);

then in the ImageReader.OnImageAvailableListener, get the bytes and process the image:

ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(final ImageReader reader) {
            mBackgroundHandler.post(new Runnable() {
                @Override
                public void run() {
                    Image img = reader.acquireNextImage();
                    final ByteBuffer buffer = img.getPlanes()[0].getBuffer();
                    byte[] bytes = new byte[buffer.remaining()];
                    buffer.get(bytes, 0, bytes.length);
                    img.close();
...
                    ((ByteBuffer) yuvImage.image[0].position(0)).put(bytes);
                    ffmpegRecorder.record(yuvImage);
...
                }
            });
        }
    };
lfdversluis commented 8 years ago

@kmlx Hey, thanks for sharing that piece of knowledge. I actually did some work on capturing the frames but ran into the issue that the ImageReader cannot be initialized with NV21. I am unfortunately not that familiar with imaging (but I learned now what image strides and planes are!), so your answer is one of the pieces I was still looking for.

lfdversluis commented 8 years ago

@kmlx I recalled having tried what you suggested. I have reimplemented it and confirmed that I got what I saw before: java.nio.BufferOverflowException. I printed both buffer sizes and noticed that the buffer from the image is much bigger than the yuvImage buffer.

kmlx commented 8 years ago

@lfdversluis Not a problem. Thank you @saudet and @lfdversluis for providing support and keeping these forums alive!

Regarding your issue, I'm assuming you're adding the imageReader as a target to the previewBuilder. In that case, you'll need to make sure that the preview images aren't too big to fit in the yuvImage. For example, you'll need to choose an optimal image size:

mVideoSize = chooseVideoSize(map.getOutputSizes(SurfaceTexture.class));
mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                            width, height, mVideoSize);

In my case, this outputs an mPreviewSize of 768x432, and the yuvImage width and height are set at 320x240. Of course, I'm getting all of these from chooseVideoSize and chooseOptimalSize, both available in the Camera2Video sample from Google. This produces a decent-quality stream with minimal CPU usage.
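The selection logic that chooseOptimalSize performs in Google's Camera2Video sample can be sketched roughly like this. This is a simplified, hypothetical reimplementation (the real method works on Android's Size class; here sizes are int[]{width, height} so it runs as plain Java): among the supported sizes, keep those at least as large as the requested preview that match the video's aspect ratio, then take the smallest.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class SizeChooser {
    /** Smallest supported size that is big enough and matches the aspect ratio. */
    static int[] chooseOptimalSize(int[][] choices, int width, int height, int[] aspect) {
        List<int[]> bigEnough = new ArrayList<>();
        for (int[] option : choices) {
            // Same aspect ratio as the video size, and large enough for the preview.
            if (option[1] == option[0] * aspect[1] / aspect[0]
                    && option[0] >= width && option[1] >= height) {
                bigEnough.add(option);
            }
        }
        if (bigEnough.isEmpty()) {
            return choices[0]; // the sample falls back to the first choice
        }
        return Collections.min(bigEnough,
                Comparator.comparingLong((int[] s) -> (long) s[0] * s[1]));
    }
}
```

For example, with choices {1920x1080, 1280x720, 768x432, 640x480}, a requested preview of 700x400, and a 16:9 video size, this picks 768x432, which matches the numbers quoted above.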

Also, RGBA_8888 has 4 channels at 8 bits/channel, so yuvImage will need to be instantiated as:

yuvImage = new Frame(width, height, Frame.DEPTH_UBYTE, 4);

where:

  * width - image width
  * height - image height
  * Frame.DEPTH_UBYTE - 8 bits/channel
  * 4 - 4 channels

Or you could make yuvImage bigger in order to fit your images. But once they fit, one way or the other, you should see an RGB stream on the other end.
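One way to catch the mismatch before it throws is to compare the buffer sizes up front. This is a hypothetical helper, not part of the sample; it only assumes RGBA_8888's 4 bytes per pixel:

```java
import java.nio.ByteBuffer;

public class FrameSizeCheck {
    /** Expected byte count for an RGBA_8888 frame: 4 channels x 1 byte each. */
    static int rgbaBufferSize(int width, int height) {
        return width * height * 4;
    }

    /** Copy only if the source fits; otherwise report the mismatch. */
    static boolean copyIfFits(ByteBuffer src, ByteBuffer dst) {
        if (src.remaining() > dst.remaining()) {
            // This is exactly the case that raises BufferOverflowException on put().
            System.err.printf("source %d bytes > destination %d bytes%n",
                    src.remaining(), dst.remaining());
            return false;
        }
        dst.put(src);
        return true;
    }

    public static void main(String[] args) {
        // A 768x432 preview does not fit into a 320x240 Frame buffer.
        ByteBuffer preview = ByteBuffer.allocate(rgbaBufferSize(768, 432));
        ByteBuffer frame = ByteBuffer.allocate(rgbaBufferSize(320, 240));
        System.out.println(copyIfFits(preview, frame)); // false
    }
}
```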

lfdversluis commented 8 years ago

@kmlx Thanks for your response. If I initialize the ImageReader like this:

mImageReader = ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, PixelFormat.RGBA_8888, 2);

and my Frame like this:

yuvImage = new Frame(DISPLAY_WIDTH, DISPLAY_HEIGHT, Frame.DEPTH_UBYTE, 4);

then I do get the buffer overflow exception. Interesting that it works for you. To be clear, I am targeting the record example.

kmlx commented 8 years ago

@lfdversluis

  1. I wouldn't recommend targeting the record example. The best-case scenario would be to convert the record example into an ffmpeg-only class, then do all the other processing in a separate camera class. Opening a camera with camera2 is completely different from camera1, and the record example doesn't apply.
  2. Image formats are device dependent. You will need to find out which formats are supported by the device using isOutputSupportedFor (docs). If the device does not support rgb, then I would recommend converting the images from the imageReader to rgb using RenderScript (e.g. ScriptIntrinsicYuvToRGB link)
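The format check in point 2 might look roughly like this, using the real camera2 classes. This is a sketch only: context, cameraId, previewWidth, and previewHeight are assumed to exist, and it cannot run outside an Android device.

```java
import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.PixelFormat;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.ImageReader;

// Query what the camera can actually output before creating the ImageReader.
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

// RGBA_8888 output is optional; YUV_420_888 is mandatory in camera2,
// so it is the safe fallback (converted to rgb afterwards).
boolean rgbaSupported = map.isOutputSupportedFor(PixelFormat.RGBA_8888);
int format = rgbaSupported ? PixelFormat.RGBA_8888 : ImageFormat.YUV_420_888;

ImageReader imageReader =
        ImageReader.newInstance(previewWidth, previewHeight, format, 2);
```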
saudet commented 8 years ago

Hi guys, I've released version 1.2 :) Any updates on this?

lfdversluis commented 8 years ago

Hi @kmlx it has been a while, but I haven't forgotten this issue yet :)

Do you have that code sample with camera2 and using the mPreviewSize like you mentioned? I have not yet been able to make it work, so I am curious what I am missing here. I am not an imaging expert or anything close, so if you have a snippet maybe I can spot my mistake by comparing the approaches.

rahulsnitd1014 commented 8 years ago

Hi @kmlx, I am having the same problem. I can't record using camera2; I'm getting a green screen on the recorded video. Any solutions would be highly appreciated.

kmlx commented 8 years ago

@lfdversluis @rahulsnitd1014 Start here. Then follow-up here.

You should be able to get javacv to work with the Camera2Video sample from Google. That should be your objective. Then you'll be hitting this.

rahulsnitd1014 commented 8 years ago

Thanks.

I have already seen these and implemented them, but it's still giving green frames.

vishalghor commented 7 years ago

Hi @lfdversluis @saudet,

Were you able to implement FFmpegFrameRecorder or the sample RecordActivity.java for the Camera2 API? If yes, could you please share the git repo for it? It would be really helpful. Thanks

lfdversluis commented 7 years ago

Hi @vishalghor, I managed to create some code that converts an Image object from an ImageReader to a NV21 byte array that can be used with the RecordActivity like it is now. I have not (yet) created an Activity that uses such an ImageReader in combination with the camera2 API.
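The Image-to-NV21 repacking described here can be sketched as a pure-Java helper. On Android, the three plane buffers and the stride arguments would come from Image.getPlanes() (getBuffer(), getRowStride(), getPixelStride()); that wiring is an assumption of this sketch, and only the stride-aware copy logic is shown.

```java
import java.nio.ByteBuffer;

public class Yuv420ToNv21 {
    /** Copy one plane into dst, honoring row/pixel strides; returns the new offset. */
    static int copyPlane(ByteBuffer src, int width, int height,
                         int rowStride, int pixelStride, byte[] dst, int offset) {
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                dst[offset++] = src.get(row * rowStride + col * pixelStride);
            }
        }
        return offset;
    }

    /** Repack separate Y/U/V planes into NV21: full Y plane, then interleaved V,U. */
    static byte[] toNv21(ByteBuffer y, ByteBuffer u, ByteBuffer v,
                         int width, int height,
                         int yRowStride, int uvRowStride, int uvPixelStride) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int pos = copyPlane(y, width, height, yRowStride, 1, nv21, 0);
        int chromaWidth = width / 2, chromaHeight = height / 2; // chroma is half resolution
        for (int row = 0; row < chromaHeight; row++) {
            for (int col = 0; col < chromaWidth; col++) {
                int i = row * uvRowStride + col * uvPixelStride;
                nv21[pos++] = v.get(i); // NV21 stores V first...
                nv21[pos++] = u.get(i); // ...then U, interleaved
            }
        }
        return nv21;
    }
}
```

The stride handling is the part that usually goes wrong: YUV_420_888 planes may be padded per row (rowStride > width) and the chroma planes may be interleaved already (pixelStride == 2), so a plain bulk copy of each buffer produces garbage.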

kmlx commented 7 years ago

@vishalghor Check the previous messages in this thread. There's a blocker for adding a camera2 example: not all devices support PixelFormat.RGBA_8888. The ffmpeg frame recorder relies on rgb byte arrays, and this format is not available on all cameras when using camera2. FYI, you can fully access rgb byte arrays on any device using the camera1 API.

more detail (camera2basic demo Camera2BasicFragment.java#L515):

imageReader = ImageReader.newInstance(VIDEO_WIDTH, VIDEO_HEIGHT, PixelFormat.RGBA_8888, 1);

PixelFormat.RGBA_8888 is not supported on all devices. JPEG is standard, but it doesn't help since we need an rgb byte array, and .jpeg just means we'd have to burn CPU transforming it into an rgb byte array, which doesn't help when you have to produce at least 20fps.

Image formats are device dependent. You will need to find out what formats are supported by the device using isOutputSupportedFor (docs). If the device does not support rgb then I would recommend converting the images from the imageReader to rgb using renderscript (e.g. ScriptIntrinsicYuvToRGB link)
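The RenderScript fallback mentioned here might look roughly like this. It is a sketch only: context, nv21Bytes, and rgbaBitmap are assumed to exist, the input is assumed to be NV21, and it can only run on an Android device.

```java
import android.renderscript.Allocation;
import android.renderscript.Element;
import android.renderscript.RenderScript;
import android.renderscript.ScriptIntrinsicYuvToRGB;

RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb =
        ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

// Input: raw NV21 bytes from the camera; output: an RGBA bitmap.
Allocation in = Allocation.createSized(rs, Element.U8(rs), nv21Bytes.length);
Allocation out = Allocation.createFromBitmap(rs, rgbaBitmap);

in.copyFrom(nv21Bytes);
yuvToRgb.setInput(in);
yuvToRgb.forEach(out);   // hardware-accelerated YUV -> RGBA conversion
out.copyTo(rgbaBitmap);  // rgbaBitmap now holds the converted frame
```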

So yeah, not an easy task if you're not hitting a device that works with RGB.

Sure, I could work on a demo that works with just a couple of devices (ones that support RGBA_8888), but then we'll have a lot of issues on github. And producing a renderscript demo is currently too time consuming for me.

Also, getting an rgb byte array from camera2 isn't really an issue for these forums; it's more of a Stack Overflow question.

So that's where we are right now.

And I haven't even mentioned performance :)

xdeop commented 7 years ago

Hi @kmlx, you've stated that you can fully access rgb byte arrays for any device using the camera1 API... how is it done? I'm using FFmpegFrameRecorder with the camera1 API (onPreviewFrame), and with a Nexus 4 I'm getting green frames...

Thanks.

kmlx commented 7 years ago

@xdeop Green frames mean the resolution or the camera settings are not supported. Choose the right resolution/settings and it will work. You can find how to choose the right resolution on Stack Overflow.
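A minimal sketch of picking a supported camera1 resolution instead of forcing an arbitrary one. The helper is hypothetical and pure Java (sizes as int[]{width, height}); on Android, the candidates would come from camera.getParameters().getSupportedPreviewSizes(), and the result would be applied with setPreviewSize(), which is an assumption of this sketch.

```java
public class PreviewSizePicker {
    /** Pick the supported size whose pixel count is closest to the request. */
    static int[] closestSize(int[][] supported, int wantWidth, int wantHeight) {
        int[] best = supported[0];
        long bestDiff = Long.MAX_VALUE;
        long want = (long) wantWidth * wantHeight;
        for (int[] s : supported) {
            long diff = Math.abs((long) s[0] * s[1] - want);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }
}
```

Feeding onPreviewFrame buffers of a size the camera never actually agreed to produce is one common cause of the green frames described above.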

xdeop commented 7 years ago

Ok. Thanks. I'll give it a try.

Has someone accomplished migrating the RecordActivity example with the Camera2 api?

Thanks.
