google-ar / arcore-android-sdk

ARCore SDK for Android Studio
https://developers.google.com/ar

Feature Request: Provide Camera Intrinsics #112

Open nbsrujan opened 6 years ago

nbsrujan commented 6 years ago

ARCore doesn't seem to provide a camera intrinsics matrix. Tango and ARKit provide intrinsic and distortion parameters. All photogrammetry and marker-based applications require camera intrinsics. If ARCore provides them, how can we access them?

inio commented 6 years ago

This is definitely on our radar. For now you can back out a pinhole model from the projection matrix and transformed UVs, but we hope to provide a directly-defined pinhole model with distortion parameters in the future.

eisenbruch commented 6 years ago

@nbsrujan or anyone at google, is this also on your radar for the ARCore Unity SDK and the other platforms as well?

inio commented 6 years ago

@eisenbruch I've asked and we expect the Unity and Unreal engine integrations to offer feature parity on this topic.

andrdmi commented 6 years ago

As far as I understand, deriving a pinhole model from the projection matrix is problematic, since ARCore adjusts the matrix for crop before returning it to the app. This effectively scales one of the axes, and the scaling cannot be corrected because the app doesn't know the actual values of fx and fy. Am I missing something (and if so, could you please provide a hint)?

inio commented 6 years ago

@andrdmi That's why I mentioned "and transformed UVs". Frame.transformDisplayUvCoords() lets you discover the current crop factor.

Note: transformDisplayUvCoords may not use the entire 0-1 texture range, even when the texture is not cropped. For example, some Android camera drivers always generate a square texture.

andrdmi commented 6 years ago

@inio Indeed, thanks!

ilamanov commented 6 years ago

@inio @andrdmi The projection matrix gets us to normalized device coordinates. Could you please explain how we can use transformDisplayUvCoords to transform normalized device coordinates to window coordinates? I am confused because the output of transformDisplayUvCoords seems to be in the 0-1 range, and I was expecting the output to be in pixel coordinates. Thank you!

inio commented 6 years ago

@ilamanov Apologies for the very pseudocode-y logic below, but hopefully it's enough to follow:

First, call transformDisplayUvCoords with the point set (0,0), (1,0), (0,1), and call the output point set (a,b), (c,-), (-,d), where - is a don't-care. From this you can form the matrix M_uv_disp:

c-a   0    0    a
 0   d-b   0    b
 0    0    1    0
 0    0    0    1 

which transforms normalized display coordinates into UV coordinates.

Now you just need to compute M_uv_camera = M_uv_disp * scale(0.5) * translate(1, 1, 0) * proj_matrix. The scale and translate are needed to convert clip-space coordinates (-1..1) into normalized display coordinates (0..1).
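For anyone following along, the recipe above can be sketched in plain Java. This is a sketch, not the SDK's implementation: a, b, c, d would come from Frame.transformDisplayUvCoords() applied to the probe points (0,0), (1,0), (0,1), and proj from Camera.getProjectionMatrix(); here they are placeholder arrays, and matrices are row-major for readability (ARCore's float[16] matrices are column-major, so transpose before mixing them with SDK output).

```java
// Sketch of the M_uv_camera recipe with row-major 4x4 matrices.
public class UvFromProjection {

    // M_uv_disp: maps normalized display coordinates (0..1) to UV coordinates.
    // (a,b), (c,-), (-,d) are the transformDisplayUvCoords outputs described above.
    public static float[] mUvDisp(float a, float b, float c, float d) {
        return new float[] {
            c - a, 0,     0, a,
            0,     d - b, 0, b,
            0,     0,     1, 0,
            0,     0,     0, 1
        };
    }

    // scale(0.5) * translate(1,1,0): clip space (-1..1) -> normalized display (0..1).
    public static float[] ndcToDisplay() {
        return new float[] {
            0.5f, 0,    0, 0.5f,
            0,    0.5f, 0, 0.5f,
            0,    0,    1, 0,
            0,    0,    0, 1
        };
    }

    // Row-major 4x4 multiply: out = lhs * rhs.
    public static float[] mul(float[] lhs, float[] rhs) {
        float[] out = new float[16];
        for (int r = 0; r < 4; r++) {
            for (int c = 0; c < 4; c++) {
                float s = 0;
                for (int k = 0; k < 4; k++) s += lhs[r * 4 + k] * rhs[k * 4 + c];
                out[r * 4 + c] = s;
            }
        }
        return out;
    }

    // M_uv_camera = M_uv_disp * ndcToDisplay * proj, where proj is the app's
    // projection matrix (here assumed already row-major).
    public static float[] mUvCamera(float[] mUvDisp, float[] proj) {
        return mul(mul(mUvDisp, ndcToDisplay()), proj);
    }
}
```

As a sanity check, with an identity projection the clip-space corner (-1,-1) should land on the UV point (a,b).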

ilamanov commented 6 years ago

@inio Thanks!

iBicha commented 6 years ago

Seconding this. It would be nice if ARCore could just give us the calibration file it is using.

klausw commented 6 years ago

I know it's an old issue, but I think @inio's example is incomplete for cases where the screen is rotated. For that, you need the off-axis elements too, using "g" and "h" instead of the "don't care" values.

So given outputs (a,b),(c,g),(h,d), the matrix would be something like this:

c-a  h-a   0    a
g-b  d-b   0    b
 0    0    1    0
 0    0    0    1 

This is untested and may be in the wrong order, I just wanted to follow up in case it helps someone.

rmonroy84 commented 6 years ago

Trying a few things inspired by the first step described here, I was able to find the intrinsic parameters for the supported devices in the ARCore APKs. You need to get apktool and run this in the same folder where you downloaded the ARCore APK (v1.2 in this example):

apktool -q d -s -r ARCore_1_2.apk -o ARCore_1_2

This will create a folder called ARCore_1_2; then go to ./lib/arm64-v8a and unpack libdevice_profile_loader.so (with 7zip, for instance) into a new folder for convenience. Then open the file .rodata in the newly created folder with a text editor. Most of the file's content is formatted as XML; just look for the name of the phone you're working with and you'll find the intrinsic parameters within the <camera> tag. There is also some IMU calibration data and extrinsics, but it's not clear how they're used.

I haven't tested the parameters myself, but will do in the next few days...

fengyang0317 commented 6 years ago

@inio I found the function you mentioned in https://github.com/google-ar/arcore-android-sdk/blob/master/samples/hello_ar_java/app/src/main/java/com/google/ar/core/examples/java/common/rendering/BackgroundRenderer.java#L133. However, c-a and d-b are 0 or 1e-8 when I run the code on a Google Pixel. Is this normal?

inio commented 6 years ago

ARCore 1.3 adds access to a simple pinhole model for both the GPU texture and CPU image. It does not include distortion parameters yet, so I'm leaving this open.
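For reference, the new API returns focal length and principal point directly, and assembling them into the usual 3x3 K matrix is straightforward. A minimal sketch: the getter calls shown in the comments are the ARCore Java API (Camera.getImageIntrinsics() and friends), while the numeric values used in the test are made-up placeholders, not real device calibration.

```java
// Assemble a 3x3 pinhole matrix K from ARCore 1.3+ intrinsics.
// At runtime the values come from, e.g.:
//   CameraIntrinsics in = frame.getCamera().getImageIntrinsics();
//   float[] f = in.getFocalLength();     // {fx, fy} in pixels
//   float[] c = in.getPrincipalPoint();  // {cx, cy} in pixels
public class Pinhole {
    // Row-major 3x3: [fx 0 cx; 0 fy cy; 0 0 1]. Zero skew, no distortion,
    // matching the simple model the API exposes.
    public static float[] k(float fx, float fy, float cx, float cy) {
        return new float[] {
            fx, 0,  cx,
            0,  fy, cy,
            0,  0,  1
        };
    }

    // Project a camera-space point (x, y, z), z > 0, to pixel coordinates.
    public static float[] project(float[] k, float x, float y, float z) {
        return new float[] { k[0] * x / z + k[2], k[4] * y / z + k[5] };
    }
}
```

A point on the optical axis should project to the principal point, which makes for an easy sanity check.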

ghost commented 6 years ago

@rmonroy84 do you know how to use the IMU calibration from ARCore?

inio commented 6 years ago

IMU↔︎camera extrinsics are now available (once tracking starts) by comparing Frame.getAndroidSensorPose and Camera.getPose (but note bug #535).
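A sketch of what that comparison looks like. With the real API it is roughly `frame.getAndroidSensorPose().inverse().compose(camera.getPose())`; below, poses are reduced to plain row-major 4x4 rigid transforms so the composition can be shown self-contained, without the SDK. The world-from-IMU and world-from-camera matrices in the test are made-up placeholders.

```java
// IMU<->camera extrinsics by comparing the two poses described above,
// using plain row-major 4x4 rigid transforms in place of ARCore's Pose.
public class Extrinsics {
    // Row-major 4x4 multiply: out = lhs * rhs.
    public static float[] mul(float[] lhs, float[] rhs) {
        float[] out = new float[16];
        for (int r = 0; r < 4; r++) {
            for (int c = 0; c < 4; c++) {
                float s = 0;
                for (int k = 0; k < 4; k++) s += lhs[r * 4 + k] * rhs[k * 4 + c];
                out[r * 4 + c] = s;
            }
        }
        return out;
    }

    // Invert a rigid transform [R t; 0 1]: result is [R^T, -R^T t; 0 1].
    public static float[] invertRigid(float[] m) {
        float[] out = new float[16];
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++) out[r * 4 + c] = m[c * 4 + r];
        for (int r = 0; r < 3; r++) {
            float s = 0;
            for (int k = 0; k < 3; k++) s += out[r * 4 + k] * m[k * 4 + 3];
            out[r * 4 + 3] = -s;
        }
        out[12] = 0; out[13] = 0; out[14] = 0; out[15] = 1;
        return out;
    }

    // imuFromCamera = inverse(worldFromImu) * worldFromCamera.
    public static float[] imuToCamera(float[] worldFromImu, float[] worldFromCamera) {
        return mul(invertRigid(worldFromImu), worldFromCamera);
    }
}
```

The resulting transform is the camera pose expressed in the IMU frame, i.e. the extrinsics (modulo bug #535 noted above).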

rmonroy84 commented 6 years ago

@gao-ouyang I've only used the distortion coefficients and intrinsics for the camera. In the file I mentioned here, there are some tags related to IMU intrinsics (b_w_b_a, intrinsics, gyro_noise_sigma, gyro_bias_sigma, accel_noise_sigma, accel_bias_sigma), but I wouldn't know how to use them.

claywilkinson commented 5 years ago

I believe this issue can be closed now since the camera intrinsics are now available. Please open a new issue if you still have questions about how to use these APIs.

inio commented 5 years ago

Reopening as original request for distortion parameters has not been resolved.

alexs7 commented 4 years ago

@inio Are distortion parameters obtainable now?

gblikas commented 4 years ago

@inio Still relevant in 2020.

This is a possible linked issue with the current method via ARCore: https://github.com/google-ar/arcore-android-sdk/issues/836

alexs7 commented 4 years ago

@gblikas Hello, in my case I went ahead and assumed the CPU image is undistorted. Are you assuming the same as well?

gblikas commented 4 years ago

@alexs7 No, I am not. For my application, that is not a good assumption to make. My hope was that ARCore itself is undistorting the image in its API and exposing that; however, everything I read implies that ARCore doesn't manipulate the texture (which it shouldn't).

However, I am not sure why exposing distortion coeffs has been put off for so long. It's such an essential feature in computer vision that it makes little sense for it not to be one of the first things surfaced.

gblikas commented 4 years ago

@alexs7 It also seems like the arcore-for-all project was accepted with zero consideration for the accuracy of the AR on "all" devices. There is also no warning about this, other than the visual inaccuracies.

alexs7 commented 4 years ago

@gblikas I am working on localization against an offline map and I am using the CPU image, which is 480 by 640 pixels. I think the reason I didn't use the texture is that sending a 1080 by 1920 image to a server caused some delays.

I am not sure, but is this something you might find relevant? https://developers.google.com/ar/reference/java/arcore/reference/com/google/ar/core/ImageMetadata#LENS_RADIAL_DISTORTION

I remember looking at distortion as well and finding that, but I couldn't get my head around those LENS_RADIAL_DISTORTION params. If I got this right, those are for the actual camera frame, but in ARCore you get a CPU image or a texture...

gblikas commented 4 years ago

@alexs7 I tried the LENS_RADIAL_DISTORTION ImageMetadata accessor way back when it first came out. When it was first implemented, I was only ever receiving 0s from it; has this changed?

alexs7 commented 4 years ago

@gblikas It's been ages; I don't even remember trying it, to be honest.

aashish2000 commented 3 years ago

Hi, I'm new to ARCore and I want to know how to extract the camera's intrinsic parameters. I tried using LENS_INTRINSIC_CALIBRATION, but that just returned an array of zeros. I also tried camera.getImageIntrinsics() to extract the intrinsics of the CPU image, but it does not provide the axis-skew parameter, which my application requires. How do I extract this parameter?

pawegio commented 3 years ago

I'm using CameraManager.getCameraCharacteristics(), but it returns valid lens characteristics only for Pixel devices...

Basel-Salahieh commented 3 years ago

@devbridie, any update on this? LENS_RADIAL_DISTORTION is not working, and there is no clue how to access the distortion parameters in the ARCore API. This would be a very useful feature, since texture images returned by ARCore are highly distorted (at close distances).

alexs7 commented 3 years ago

@Basel-Salahieh can you use the CPU image?

Basel-Salahieh commented 3 years ago

> @Basel-Salahieh can you use the CPU image?

I'm using the following to get access to them, but getting MetadataNotFoundException:

```java
ImageMetadata imageMetadata = frame.getImageMetadata();
float[] lensRadialDistortion = imageMetadata.getFloatArray(ImageMetadata.LENS_RADIAL_DISTORTION);
```

Is there another way to access it?

emaschino commented 2 years ago

Strange: retrieving camera intrinsic parameters with the ACAMERA_LENS_INTRINSIC_CALIBRATION metadata gives 0 for all parameters, whereas ArCameraIntrinsics gives sensible values, both for the CPU image and GPU texture. Is this expected? This is on a OnePlus 7. I can't comment on camera distortion parameters: the ACAMERA_LENS_DISTORTION metadata isn't supported (ACAMERA_LENS_RADIAL_DISTORTION was deprecated by the former).

GEllickson-Hover commented 1 year ago

Does anyone have an update on this? I still receive MetadataNotFoundException when querying for ImageMetadata.LENS_RADIAL_DISTORTION on all 3 of my devices.