google-ar / arcore-android-sdk

ARCore SDK for Android Studio
https://developers.google.com/ar

Feature Request: Dense Pointcloud from depth sensors or stereo #120

Open SimonScholl opened 6 years ago

SimonScholl commented 6 years ago

I know there were already several requests related to possible support of Tango devices in ARCore. My question is based on the fact that we want to use the already available depth sensors to capture dense depth data with our devices. So is there any plan (short, mid, or long term) for ARCore to use the hardware capabilities of Tango devices?

Is it possible for ArFrame_acquirePointCloud() to deliver not only depth data about features, but also the dense point cloud we already got via Tango, when the hardware is there?
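For anyone new to the API being discussed: ARCore already exposes a sparse feature point cloud (ArFrame_acquirePointCloud() in the NDK, Frame.acquirePointCloud() in Java); the request here is for it to be dense. Below is a minimal sketch of consuming what the current API returns, in plain Java. The ARCore calls are shown only in comments, and the 0.5f threshold and helper names are my own assumptions, not anything from the SDK:

```java
import java.nio.FloatBuffer;
import java.util.ArrayList;
import java.util.List;

// Sketch: ARCore's PointCloud.getPoints() returns a FloatBuffer holding four
// floats per point: x, y, z and a confidence value in [0, 1]. On-device you
// would obtain it roughly like this (inside a try-with-resources):
//   PointCloud cloud = frame.acquirePointCloud();
//   FloatBuffer points = cloud.getPoints();
// The helper below only parses such a buffer.
class SparseCloud {
    static final int FLOATS_PER_POINT = 4;

    /** Returns the [x, y, z] of every point whose confidence is at least minConfidence. */
    static List<float[]> filterPoints(FloatBuffer points, float minConfidence) {
        List<float[]> out = new ArrayList<>();
        while (points.remaining() >= FLOATS_PER_POINT) {
            float x = points.get(), y = points.get(), z = points.get(), c = points.get();
            if (c >= minConfidence) {
                out.add(new float[] {x, y, z});
            }
        }
        return out;
    }
}
```

On a feature-point cloud this yields at most a few hundred points per frame, which is exactly why it is no substitute for the dense Tango-style cloud being requested.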

jdesbonnet commented 5 years ago

@lvonasek apologies for the shaky nature of the video (trying to hold two phones isn't too easy :-) ), but I hope it illustrates resolution and frame rate: https://youtu.be/295Twtb0uO0 (for anyone not up to date with the last few messages, this is the Huawei P30 Pro running the ToF camera demo which @lvonasek posted above)

mpottinger commented 5 years ago

@jdesbonnet Thanks for the demo.

Definitely seems quite limited to me. Maybe that explains why there is no support in ARCore yet. It's still early days for the hardware side.

This is enough to convince me that these sensors are not suitable for my purposes yet.

chucknology commented 5 years ago

Hi all, I'm replying from the official Chuck Knowledge account. Post-I/O it should be clear what it is, and now there's no embargo on information. We're moving depth capability development ahead without Huawei. It's delicate and Googlers can't speak to it, so please don't put good people in a bad situation. As always, you can talk on the Slack, and if you need to ship you can contact me directly: chuck@knowledge.wtf

kexar commented 5 years ago

@lvonasek Thanks for your ToF data viewer; it's working for me on the P30 Pro. It would be cool if you could make it into a Unity plugin.

@chucknology Too bad, I bought P30 Pro because of ToF integration into ARCore :(

lvonasek commented 5 years ago

@kexar if Google adds Shared Camera functionality to the ARCore NDK, then I would be able to enable the ToF sensor in my 3D Scanner for ARCore even on Huawei devices. However, I am not sure if I can do it in Unity, because there I would have to change the ARCore initialization.

Sure, it is possible to add ToF into Unity without AR, but I do not see any use case for that.

sethhaller commented 5 years ago

Really want to dev for ToF again. Working with Tango and the ZenFone AR was the most fun I have had. What you could do with Tango blows ARCore out of the water. (My test app: https://youtu.be/z_seJgR5CCA). Really, really hope ARCore starts to implement Tango functionality with the reemergence of rear-facing/world-facing ToF. I want to develop for the Samsung 5G ToF but cannot find any way to do so.

ar-ml commented 5 years ago

@chucknology Do I understand the situation right: the Honor View 20 and Huawei P30 Pro will never get ToF support with ARCore because of Mr. Trump? So probably Huawei will make ToF data available through their "HUAWEI AR Engine" at a certain point... Is the Samsung S10 5G now the best chance for ToF support with ARCore, or is there another (better/cheaper) device? Or do you think ToF support has now moved to a very low priority within the Google team?

marklinmao commented 5 years ago

Apologies if this query is a little off topic from the original question: but now that Google I/O 2019 has passed and there seems to be no way to access raw data from ToF-camera-enabled Android devices, can anyone recommend some low-cost (ideally less than $500) ToF cameras that have an open SDK? I'm looking for a resolution of 80x60 at the very least.

I found an external ToF camera for mobile phones here. From Tmall it's around 130 USD. They provide an SDK, but I'm not sure how well it works. Let us know if you try it: https://detail.tmall.com/item.htm?id=581766178950&spm=a1z09.2.0.0.12e72e8dU4jVuo&_u=39n3ahn8440&skuId=4223523143092

This is the spec info from their official site: http://www.myntai.com/dev/mynteye_mobile

SDK/sample: https://drive.google.com/drive/folders/1wVp4xqqgjidPQyzzW1Tmibbw4yY5p4sv


lvonasek commented 5 years ago

Hi @jdesbonnet,

can you do another video for me? I've just gotten the ToF sensor working in my 3D scanning app. However, my Honor only works with a resolution of 20x15. Can you test it on your Huawei P30 with a higher resolution? You have to check "Use depth sensor" in the scan dialog and then select the correct resolution. Here is the link: https://drive.google.com/open?id=1Y_dF5BZhLdnHTtEClCcQS7Yx6bCvfuAi

Thank you in advance, Lubos

marklinmao commented 5 years ago

> @kexar if Google adds Shared Camera functionality for ARCore NDK then I would be able to enable ToF sensor in my 3D Scanner for ARCore even on Huawei devices. However I am not sure if I can do it in Unity because there I would have to change the ARCore initialization.
>
> Sure it is possible to add ToF into Unity without AR but there I do not see any usecase.

I downloaded your apk on my P30 Pro and it works well! So do I get it right that it is possible to retrieve the depth data (with the Camera2 API, like this apk), but it's not possible to somehow 'feed' it to ARCore for better 3D reconstruction?
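That is roughly the situation being discussed: Camera2 can hand you DEPTH16 frames on these devices, but ARCore offers no input path for them. For anyone experimenting on their own, here is a hedged sketch of unprojecting one DEPTH16 frame into camera-space points with a simple pinhole model. The intrinsics (fx, fy, cx, cy) are placeholder values you would have to query from CameraCharacteristics.LENS_INTRINSIC_CALIBRATION on a real device, and the class name is my own:

```java
// Sketch: unproject a DEPTH16 frame (short[] of width*height samples, as read
// from an ImageReader plane) into camera-space 3D points. Per the Android
// DEPTH16 format, the low 13 bits of each sample are the range in millimeters.
class DepthUnprojector {
    static float[][] toPointCloud(short[] depth, int width, int height,
                                  float fx, float fy, float cx, float cy) {
        float[][] points = new float[width * height][];
        for (int v = 0; v < height; v++) {
            for (int u = 0; u < width; u++) {
                int i = v * width + u;
                float zMeters = (depth[i] & 0x1FFF) / 1000.0f; // mm -> m
                points[i] = new float[] {
                    (u - cx) * zMeters / fx,  // X
                    (v - cy) * zMeters / fy,  // Y
                    zMeters                   // Z (0 where no measurement)
                };
            }
        }
        return points;
    }
}
```

This gives you a per-frame cloud in the camera's own coordinates; fusing frames into one model still requires a pose per frame, which is exactly the part ARCore refuses to combine with external depth.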

lvonasek commented 5 years ago

@marklinmao can you make a video of how it works on your phone and share it with me?

I see that you quoted my older comment. I have solved many problems since then, and I think I can add it to my Unity 3D reconstruction asset in a week or so (depends on the weather in Germany :D). But it won't be ready for releasing apps because of the limited device support (there are about ten ToF devices, but currently I know only of the Huawei P30 having the Camera2 API working).

marklinmao commented 5 years ago

No problem, I will do it in a few hours.

Best regards, LIN.


marklinmao commented 5 years ago

@lvonasek Here is the video on the P30 Pro: https://vimeo.com/340064857 It supports 240x180, but it doesn't look quite OK. Just let me know when you have a newer version; I can test it for you.

lvonasek commented 5 years ago

Thanks @marklinmao. I am afraid that without buying a P30 I cannot fix it. Does 20x15 work correctly? (That's the resolution which works on my Honor V20.)

marklinmao commented 5 years ago

With 20x15, the camera can be opened but the mesh is not created. Nothing happens. With other resolutions, the camera freezes.


kexar commented 5 years ago

@lvonasek I tried it on P30 Pro too and had the same problems as @marklinmao

lvonasek commented 5 years ago

@kexar It is most likely some calibration issue in my code. I asked @marklinmao to try to fix my code but he is currently too busy.

lvonasek commented 5 years ago

Could someone test this version for me on a Huawei P30 or P30 Pro? (I might have fixed the previous issue.) Just check "Use depth sensor" and select resolution 240x180.

https://drive.google.com/file/d/1-2q6cyZED8BnVCHHkwtzK2b5oj7sg-77/view?usp=drivesdk

kexar commented 5 years ago

@lvonasek I tried it. It worked, but the quality of the 3D model seems comparable to the ARCore version. I can make a video tomorrow...

lvonasek commented 5 years ago

@kexar Even that is a success. Please make a video with 2cm resolution, thank you.

kexar commented 5 years ago

@lvonasek I made a video https://drive.google.com/file/d/1BziohrVFAO3K46-e-1bYEn9VoxUMkZ0f/view?usp=drivesdk

lvonasek commented 5 years ago

@kexar Thanks for the video. One thing I see there is that wall estimation is enabled (that's wrong, because ToF can do it more accurately); this I have to fix. Another thing is that I am using DEPTH16 access, which should be used for occlusion only; I will use DEPTH_POINT_CLOUD as soon as it is supported on my device.
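For anyone following along: a DEPTH16 sample is not a plain distance. Per the Android ImageFormat.DEPTH16 documentation, the low 13 bits hold the range in millimeters and the top 3 bits a confidence code, which is part of why it is characterized as occlusion-grade rather than point-cloud-grade data. A small decoding sketch (class name is mine):

```java
// Decode one DEPTH16 sample as specified by android.graphics.ImageFormat#DEPTH16:
// bits 0-12 = range in millimeters, bits 13-15 = confidence code, where 0 means
// "confidence not available" and codes 1..7 map linearly onto 0..100%.
class Depth16 {
    static int rangeMillimeters(short sample) {
        return sample & 0x1FFF;
    }

    /** Returns confidence in [0, 1]; a code of 0 is treated as full confidence. */
    static float confidence(short sample) {
        int code = (sample >> 13) & 0x7;
        return code == 0 ? 1.0f : (code - 1) / 7.0f;
    }
}
```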

mpottinger commented 5 years ago

@marklinmao That looks like a stereo camera, not ToF. It won't be as precise, but it will have higher resolution anyway.

I bought an active stereo camera, the Structure Core sensor. It works with Windows, Linux, or Android. The RealSense D415 is a good active stereo option as well.

These would not be of much help with ARCore, but it is very doable to make your own AR with Unity or another game engine using one of these cameras.

My application is not for general public consumption, so special equipment is no problem. ARCore just seemed like a convenient way to do it, but without depth it's a no-go.

I am going to use an active stereo depth camera attached to a Surface Go tablet and do AR that way. With an Intel RealSense T265 to do the position tracking in Unity and a Structure Core for the depth.

It's an option if you need AR but aren't making apps for the app store.

ar-ml commented 5 years ago

@mpottinger What you describe is exactly what we wanted to do: using a Structure Core or Intel RealSense together with a tablet device to build a proprietary high-quality AR solution for our customers.

Problem: as far as I know, none of the existing SDKs from Occipital or Intel support persistence. I had a Skype call with an Occipital product manager and a chat with someone from Intel support: both say they might add "persistence" features (= saving/loading of area description files) one day, but it's not their priority. I think Occipital currently only offers a basic SDK that provides a depth data and IMU stream; it looks like they are still working on improving their SLAM, while persistence is far away on their roadmap!

If you have tested those sensors for AR purposes, I am thankful for more information. How well do they work? What about occlusion and SLAM accuracy, loop correction, etc.?

Also, if anyone knows a solution for persistent AR experiences with one of the hardware sensors, help would be very much appreciated. If someone wants to build a "persistence extension" for one of the sensors, we would be willing to contribute to that. Thx!

mpottinger commented 5 years ago

@ar-ml I have only just started playing around with them, but so far it looks really promising.

I haven't tried to do any persistence yet but I have read that the T265 allows for loading and saving localization maps on the device.

Accuracy seems almost on par with ARCore, but relocalization is slower and fails more often if the camera is moved really fast, shaken a lot, or moved in a twisting/twirling motion.

The big advantage for me is that you have access to direct pose information. In ARCore you have to use anchors, because the pose estimate from ARCore is only valid in the current frame.

With the T265 you deal with the coordinates directly. So if you always start from the same exact physical location, you have easy persistence for pretty much as long as you want. This method is enough for me.

If that isn't an option, the T265 map saving might work, but I haven't tested it.

Yes, the Structure Core just gives basic depth and IMU. That's why I have the T265 for position tracking.

Occlusion from the depth map I haven't figured out yet, but I know there is a way to do it in Unity; I have read a description of how it is done but haven't seen any code.

Oh, and I almost forgot to mention that the T265 has a Unity demo for position tracking, so it is just a matter of implementing the color camera view background, and depth can easily be used for placing objects anywhere, not just where planes are detected.

Occlusion in Unity, I have read, involves manipulating the z-buffer, but I haven't tried it yet.
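The z-buffer idea above reduces to a per-pixel comparison: a virtual fragment is hidden whenever the sensor sees real geometry in front of it. In Unity this would live in a shader, but the decision itself is trivial; here is a plain-Java sketch of it (the class name and the epsilon noise margin are my own assumptions, not values from any SDK):

```java
// Sketch of depth-based occlusion: compare the virtual object's depth at a
// pixel against the real depth measured by the sensor at the same pixel.
// A shader version would write/test the z-buffer, but the decision is the same.
class OcclusionCheck {
    /**
     * @param realDepthMeters    sensor depth at this pixel (0 = no measurement)
     * @param virtualDepthMeters depth of the rendered fragment at this pixel
     * @param epsilonMeters      noise margin (arbitrary assumption, e.g. 0.05 m)
     * @return true if the fragment should be hidden behind real geometry
     */
    static boolean isOccluded(float realDepthMeters, float virtualDepthMeters,
                              float epsilonMeters) {
        if (realDepthMeters <= 0f) {
            return false; // no depth measurement at this pixel: draw the fragment
        }
        return realDepthMeters + epsilonMeters < virtualDepthMeters;
    }
}
```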

lvonasek commented 5 years ago

Hi guys,

yesterday I ported my scanner (again) to Huawei AREngine; this time the motion tracking works with a constant coordinate system (not like last time). On my Honor View 20 I get more accurate motion tracking with AREngine than with ARCore. The depth sensor is officially supported on AREngine; I also got it working in a really short time, and it works much better!

Can someone make a video of how it works on a Huawei P30 or P30 Pro? I need to test this configuration: 2cm, without "poisson reconstruction", without "realtime gap filling", and with "use depth sensor".

Here is the version using Huawei AREngine: https://drive.google.com/open?id=1xpz383OHdDIVLVt3tuI-VOBp1WiuNQf5

Thank you in advance

ar-ml commented 5 years ago

@mpottinger Thanks for your reply! It would be great if you contacted me via email: info@ar-action.com I read this interesting RealSense T265 case study: https://www.intelrealsense.com/visual-inertial-tracking-case-study/ I'd like to discuss some more details about its capabilities with you, and I fear it's off-topic here. Also, if you want, you could maybe do some tests for us.

ar-ml commented 4 years ago

For anyone who is interested in AR applications with the T265: we tested it last week and it's not an option for us right now, for the following reasons:

  1. Android support for the RealSense T265 is planned but not scheduled yet. (@mpottinger Or did you get it running on Android?)
  2. The Unity wrapper does not show any hint of saving and loading maps / persistence features, even though this should be there according to some GitHub posts and a video from Intel.
  3. Tracking problems occur when rotating the device quickly.

We will wait until Android support has been officially added and will then re-evaluate it. In the meantime I am happy to hear from anyone who has tested the T265 with an AR use case.

Related posts: https://github.com/IntelRealSense/librealsense/issues/4263
https://github.com/IntelRealSense/librealsense/tree/master/examples/ar-basic

AndrewRogers1994 commented 4 years ago

Is it still the case that you can't use ToF sensors with ARCore (for example, the S10 5G is getting a firmware update soon that will provide access)?

mpottinger commented 4 years ago

@ar-ml Sorry for my late reply. I mostly work on this stuff in my spare time and I haven't had much lately! Most of my recent work has been on the depth sensor integration with Unity to get live occlusion working, object placement, etc. Not so much on the tracking yet.

I am using a Surface Go tablet, since I am not planning on creating an Android app store app.

An option with Android is to use a rooted device, if this is not for the app store. I have gotten the Google Edge TPU, which is a Linux-only device, working on a rooted Samsung phone with no issues using Linux Deploy.

Once you have a Linux chroot on the phone you can compile just about anything that uses libusb instead of kernel drivers.

I haven't tried it, but I expect the RealSense would work without an issue through that route on an Android device.

And yes, the T265 is definitely sensitive to rapid movement, especially fast rotation. That is not an issue in my case. The limitations in ARCore are more of an issue for me, despite the excellent tracking.

mpottinger commented 4 years ago

@Murded No built-in support at all. It should be possible to roll your own support, though, depending on what you need.

For example, I need to be able to place objects instantly on any physical surface as a sort of marker. No need for physics, etc. Getting the depth information directly and using that would be straightforward using the ToF reader example posted here.

Occlusion using a depth map is possible with a custom shader on PC. I think Android limitations might make that more difficult; I have no idea how it would work.

AndrewRogers1994 commented 4 years ago

@mpottinger When you say "roll your own support", I presume you mean without using ARCore?

For example, if my end goal was to get a dense point cloud, then I'd just use the Android camera API to get data from the ToF sensor, rather than somehow making ARCore use the sensor?

mpottinger commented 4 years ago

@Murded Well, it should be possible to use it with ARCore in a rudimentary way.

For example, in my use case I only need to place objects in fixed positions. If you tap on the screen, get the xyz coordinate to place the object.

With ARCore alone and no ToF, I would have to use the sparse point cloud/features, which are sometimes unavailable on some surfaces. It limits where I can place fixed objects and is often very inaccurate.

With ToF it would simply be a matter of creating an ARCore anchor at the distance given by the ToF sensor at a given point on the screen.

For occlusion, I have heard custom shaders work in Unity on PC, but I haven't gotten far enough yet to see if it would also work on Android.
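The "anchor at the ToF distance" idea above needs one practical step first: raw ToF maps have holes, so you want the nearest valid depth sample around the tapped pixel rather than the exact pixel. A hedged sketch of that lookup (the class name and window search are my own; on-device you would then build a pose from the resulting distance, e.g. via Pose.makeTranslation(...) and session.createAnchor(pose) in the ARCore Java API):

```java
// Sketch: read the ToF depth at a tapped pixel, falling back to the nearest
// non-zero DEPTH16 sample within a small window, since raw depth maps have
// holes where the sensor got no return.
class TapDepth {
    /** Returns depth in meters at (u, v), searching a (2r+1)^2 window for the
     *  nearest non-zero DEPTH16 sample; returns 0 if none is found. */
    static float depthAt(short[] depth, int width, int height,
                         int u, int v, int r) {
        float best = 0f;
        int bestDist = Integer.MAX_VALUE;
        for (int dv = -r; dv <= r; dv++) {
            for (int du = -r; du <= r; du++) {
                int x = u + du, y = v + dv;
                if (x < 0 || y < 0 || x >= width || y >= height) continue;
                int mm = depth[y * width + x] & 0x1FFF; // DEPTH16 range bits
                int dist = du * du + dv * dv;
                if (mm > 0 && dist < bestDist) {
                    bestDist = dist;
                    best = mm / 1000.0f; // mm -> m
                }
            }
        }
        return best;
    }
}
```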

AndrewRogers1994 commented 4 years ago

@mpottinger Ahh okay, I see what you're saying now. Thanks again; shame that's not natively supported, though.

lvonasek commented 4 years ago

If someone wants to test ToF functionality, here is an updated app for it: https://play.google.com/store/apps/details?id=com.lvonasek.tofviewer

anirudh-simha commented 4 years ago

Does it use the Camera2 API to get depth data?

lvonasek commented 4 years ago

Yes, but not many devices support it. AFAIK it works only on the Honor View 20 and Huawei P30 Pro.

anirudh-simha commented 4 years ago

I don't have a ToF-supported device yet, but do you get depth-related data in different planes of the image in the ImageReader? We will be getting some devices in the near future for a project using this feature.

lvonasek commented 4 years ago

It is just one plane in the ImageReader.

anirudh-simha commented 4 years ago

Thank you, much appreciated!

pacomibox commented 4 years ago

Hi @lvonasek, why is the minimum resolution 2 cm? Is it a library limitation?

lvonasek commented 4 years ago

Hi @pacomibox, I am limiting it because the higher resolution currently does not work so well. But it should be possible to scan with a 1cm grid soon (one of my contributors is well on the way to solving the issues connected with higher resolution).

yeongrok commented 4 years ago

Hi @lvonasek, I tried to test the ToF function on a Samsung Galaxy S10 5G with your app (https://play.google.com/store/apps/details?id=com.lvonasek.tofviewer). But it didn't work, giving the message "Camera2 API: TOF not found". Can you guess why? I'm sure the phone has a ToF sensor.

Thaina commented 4 years ago

@yeongrok It's possible that even though the hardware exists, the device or firmware itself does not expose it to the Google SDK. This happened to me with the dual camera on the Nokia 7 Plus: it contains dual-camera hardware (not ToF), but it was not listed in the multi-camera capability of the Google SDK. The dual camera can only be accessed by Nokia's own bokeh camera app.

lvonasek commented 4 years ago

Hi @yeongrok, it is exactly as @Thaina said.

yeongrok commented 4 years ago

@lvonasek, @Thaina, thank you for your answers. I have more questions. I got 6 cameras (2 front, 4 rear) from the Camera2 API's cameraManager.getCameraIdList() on a Galaxy S10 Plus 5G, and getOutputSizes(SurfaceTexture.class) returned null for the camera that is, at my guess, the ToF camera. Does this mean the device or firmware itself does not expose it to the Google SDK? I'm very sorry for writing this question on this thread.

lvonasek commented 4 years ago

It would work with ImageReader only, but the S10 5G does not have it enabled yet. AFAIK currently only Huawei devices have ToF functionality enabled through the Camera2 API.
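A more direct way to tell whether a device actually exposes its ToF camera is to look for the DEPTH_OUTPUT capability rather than probing getOutputSizes. A sketch; the Android calls appear only in comments, and only the pure check is shown as runnable code:

```java
// Sketch: detect a depth-capable camera via Camera2. On-device you would do:
//   CameraCharacteristics chars = cameraManager.getCameraCharacteristics(id);
//   int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
// A camera that exposes depth advertises
// CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT (constant value 8).
// If no camera reports it, the hardware simply isn't exposed to the SDK, which
// matches the "TOF not found" message above.
class DepthCapability {
    static final int DEPTH_OUTPUT = 8; // CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT

    static boolean hasDepthOutput(int[] capabilities) {
        if (capabilities == null) return false;
        for (int cap : capabilities) {
            if (cap == DEPTH_OUTPUT) return true;
        }
        return false;
    }
}
```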

yeongrok commented 4 years ago

OK, Thank you all so much for your answers.

AndrewRogers1994 commented 4 years ago

> It would work with ImageReader only but S10 5g does not have it enabled yet. ADAIK currently only Huawei devices have enabled ToF functionality using Camera2 API.

Which Huawei Devices currently have TOF Enabled?

lvonasek commented 4 years ago

The Honor View 20 and Huawei P30 Pro.

Also, the front ToF on the Huawei Mate 20 Pro is working.