google-ar / arcore-android-sdk

ARCore SDK for Android Studio
https://developers.google.com/ar

Feature Request: Dense Pointcloud from depth sensors or stereo #120

Open SimonScholl opened 6 years ago

SimonScholl commented 6 years ago

I know there have already been several requests related to possible support of Tango devices in ARCore. My question is based on the fact that we want to use the already available depth sensors to capture dense depth data with our devices. So is there any plan (short, mid, or long term) for ARCore to use the hardware capabilities of Tango devices?

Is it possible that ArFrame_acquirePointCloud() could not only deliver depth data about visual features, but also give us the point cloud we already got via Tango, when the hardware is there?
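For reference, the point cloud that ArFrame_acquirePointCloud() exposes (ArPointCloud_getData() in the C API, PointCloud.getPoints() in Java) is a flat buffer of four 32-bit floats per point: x, y, z, and a confidence value. A minimal Python sketch of unpacking such a buffer; the sample values are made up:

```python
import struct

def unpack_point_cloud(buf: bytes):
    """Unpack an ARCore-style point cloud buffer: four little-endian
    float32 values (x, y, z, confidence) per point."""
    floats = struct.unpack("<%df" % (len(buf) // 4), buf)
    return [tuple(floats[i:i + 4]) for i in range(0, len(floats), 4)]

# Two hypothetical points, packed the way ARCore lays them out.
raw = struct.pack("<8f", 0.1, 0.2, 1.5, 0.9,   # point 1
                         -0.3, 0.0, 2.0, 0.4)  # point 2
points = unpack_point_cloud(raw)
print(points[0])  # first point: (x, y, z, confidence)
```

With feature-based tracking this buffer holds at most a few hundred sparse points; the request here is for the same layout to carry a dense sensor-derived cloud.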

inio commented 6 years ago

This is definitely something we could do, but currently ARCore doesn't even run on any devices with depth cameras so it's a bit early to even think about.

helipiotr commented 6 years ago

Copied from Google Tango homepage:

In addition to working with our OEM partners on new devices, Google is also working closely with Asus, as one of our early Tango partners, to ensure that ZenFone AR will work with ARCore. We'll be back with more information soon on how existing and new ZenFone AR users will get access to ARCore apps.

It seems that at one point ARCore will be running on ZenFone AR. It would be a great feature for the existing hardware.

SimonScholl commented 6 years ago

Yeah, I already read that, but to clear things up: this feature request is not about "running ARCore on Asus"; that much I expect anyway. It is about getting dense point cloud data from devices which have a hardware component like a Time-of-Flight sensor, so that ARCore would offer us the best of all development possibilities.

jonomacd commented 6 years ago

I see that ARCore for the ZenFone is imminent. Is this going to have point cloud support? Will the release deprecate the point cloud support that it currently has? I use Matterport scenes all the time; I want that to still work, but I would also like the new features of ARCore.

How is this going to work?

inio commented 6 years ago

Sorry, not yet. Point clouds on the ZenFone are derived from visual features, just like on other ARCore phones.

SimonScholl commented 6 years ago

I hope it doesn't override the Tango core on the framework. Until there is support for dense depth data from the related sensors, no one should be forced to use ARCore. A lot of developers need that data, for existing and upcoming applications.

inio commented 6 years ago

ARCore DP2 and later can coexist with TangoCore.

kevhill commented 6 years ago

Any update on this with regards to Snapdragon 845 / Galaxy S9 compatibility being unveiled "within weeks"?

inio commented 6 years ago

@kevhill Wrong bug? (I think you meant #250)

kevhill commented 6 years ago

No, but certainly lacking clarity.

I meant that one of the features Qualcomm is promoting for the 845 is accurate, dense point clouds based on dual cameras as opposed to IR (through the Spectra 280 ISP). As the S9 has all of the required hardware, and in theory the 845 has software support for it, it seems like this should be a straightforward feature to expose through ARCore.
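For context on why dual cameras enable dense depth: with a rectified stereo pair, depth follows directly from disparity as Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. A small sketch; the focal length and baseline below are hypothetical phone-like numbers, not S9 specs:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a rectified stereo match: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1400 px focal length, 12 mm baseline.
z = depth_from_disparity(focal_px=1400.0, baseline_m=0.012, disparity_px=8.0)
print(f"{z:.2f} m")  # prints 2.10 m
```

The formula also shows the limitation discussed later in this thread: textureless surfaces (white walls) yield no matches, hence no disparity and no depth.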

kevhill commented 6 years ago

If anyone hasn't seen the demo video, it is pretty sweet. https://www.youtube.com/watch?v=16vz3_6-tbM

inio commented 6 years ago

Ah, sorry, didn't read that clearly. Makes sense now.

No, no update on dense depth from stereo. I've updated the FR description to include it.

lvonasek commented 6 years ago

@inio Any update on this? The Asus ZenFone AR has already been supported for a few months.

This app is waiting for it: https://play.google.com/store/apps/details?id=com.lvonasek.arcore3dscanner

inio commented 6 years ago

@lvonasek Nope.

Thaina commented 5 years ago

How has this progressed?

I think almost all devices on which ARCore is available contain a stereo camera, so the point cloud could be handled better.

lvonasek commented 5 years ago

This video shows a 3D scanner using the depth sensor on a non-Tango device: https://youtu.be/7ZSm95naghw?t=127. Is it based on ARCore, or shall we migrate to another technology?

SimonScholl commented 5 years ago

> This video shows a 3D scanner using the depth sensor on a non-Tango device: https://youtu.be/7ZSm95naghw?t=127. Is it based on ARCore, or shall we migrate to another technology?

Hi, I also had an eye on the new Oppo. As far as I know it is not based on ARCore; there is not even an open development kit available to use this depth sensor for new apps. I still hope ARCore will offer Tango-like functions for phones with the right hardware components, as depth sensors become more common on Android devices.

lvonasek commented 5 years ago

It is written in #603 that they are not going to implement it in the near future. About Oppo: I watched the same presentation in several languages. At least some information was shared here: https://youtu.be/-Vz5US3vt5E?t=3274

SimonScholl commented 5 years ago

> It is written in #603 that they are not going to implement it in the near future. About Oppo: I watched the same presentation in several languages. At least some information was shared here: https://youtu.be/-Vz5US3vt5E?t=3274

Thanks for the link, an informative video. The question is what Google understands by "near term": that could mean months of waiting, or a year. My hope is that as the number of hardware-enabled devices rises again, the demand from our developer community to use those sensors will force them to give this topic a higher priority.

They already have the knowledge to integrate these sensors, as they did with Tango; maybe it was not perfect, but it was working. So why not bring together the best of both worlds?

lvonasek commented 5 years ago

The demand from developers is quite big. Every time I talk with someone about ARCore, I hear words like "disappointment", "useless", etc. Tango was working well in the business area; that's why I made this request: #638

Thaina commented 5 years ago

@lvonasek It's not really that big at all. Just computer vision with a stereo camera is enough for everything we all need. It was just a paradoxical decision by the ARCore team, and I don't know who came up with it.

The point is that ARCore always limits the devices it can run on for no reason. Almost all of its supported devices have a stereo camera, as if it were required, yet they don't utilize the stereo camera when they should. ARCore could run on a normal phone with a single camera, but they just said the quality of those phones is not acceptable to them. However, the quality of ARCore running on my phone right now, even the Measure app from Google, is not acceptably accurate or usable.

lvonasek commented 5 years ago

@Thaina The difference between structure from motion with a single camera (the current implementation) and a passive stereo camera is that with stereo the transformation from the first photo to the second is known. Then you can get a much more accurate point cloud. But there would be the same problems as with the current implementation: white walls won't be visible to the device, and occlusion or collision with the real world would not work any better. That's why I am saying that the demand for a depth sensor is quite big: the other systems are just compromises, not full solutions. Of course a stereo-camera solution would be better, but Google cannot make Pixel rivals better than its own device. Maybe Google will solve it later using AI, but I am not aware of any production-ready system for it.

And about limiting device support, there is a really good reason for it. Computer vision needs camera calibration, and Google needs IMU calibration. Google has a lot of work with every supported device (calibrating all possible device variants, fixing device firmware, firmware updates, whitelisting the device on Google Play). I would say we need patience; Google has enough work to do.

lvonasek commented 5 years ago

https://www.androidauthority.com/samsung-galaxy-s10-tof-sensor-927054/ "A rear-facing TOF sensor could provide a more accurate augmented reality experience than ARCore"

Would it not be better for Google to establish a depth sensor standard instead of letting Oppo, Samsung, Huawei, and others create their own? I can imagine that without depth sensor support in ARCore there will be more SDKs in the future, and you won't be able to use them together with ARCore because of the locked camera.

mpottinger commented 5 years ago

Yeah, with more ToF-equipped phones coming out, this will be a must-have. AR without dense depth is just half-baked, IMO. I can't make good use of it. I am forced to use the ZED SDK with the ZED stereo camera for now to get depth and occlusion. Not ideal for me either, but it's all I can find for now.

More and more ToF camera phones will be coming. AR will only make sense when it takes advantage of that, in my opinion. I await that day, and when it comes I will get excited about AR on phones.

Thaina commented 5 years ago

White walls are a known limitation we can't solve, but depth from stereo can achieve many more things even without a true depth sensor.

You can't expect manufacturers to add a depth sensor just for AR; it has no practical use for normal customers. But a stereo camera does: it is used in photography, and that's all we need.

You are underestimating depth imaging. Just a depth image from a stereo camera would be exponentially useful. Just the ability to start an AR session without moving the camera around is precious. And it could make a guess even on a white wall if we could get depth on the edge between the wall and the floor. There would be so many more uses if we could get a reliable depth image into AR.

lvonasek commented 5 years ago

@mpottinger - Huawei already confirmed that ToF support is coming to ARCore (it was mentioned during the Huawei P30 presentation in Paris). The ToF sensor runs at the same framerate as the RGB camera and can detect objects in the 1 cm-4 m range. I expect it will be introduced at Google I/O on the 7th of May.

@Thaina Even stereo cameras cannot detect white walls the way a ToF camera can. Anyway, I made wall detection which works on all Android ARCore devices: https://www.linkedin.com/feed/update/urn:li:ugcPost:6512251490525020160 (You can download it from the Unity Asset Store.)

But what worries me more is that Google is decreasing the amount of support here (there are topics that have gone unanswered for a very long time). This is the same behaviour they showed shortly before ending Google Tango (which was in summer 2017).

Tango was killed after Apple introduced ARKit. Will ARCore be killed because Microsoft introduced HoloLens 2?

mpottinger commented 5 years ago

@lvonasek That is really good to know, thanks!

Also, there are two software-only approaches by Facebook Research and Google that look really promising and are supposed to work at camera framerates.

The Facebook approach is open source, but only as a slightly crippled Python reference version; the parts that make it real-time are left out of the Python code. By default it takes minutes per frame. I was able to optimize it, and the best I could achieve was 12 FPS at very low resolution with the Python code, but it was proof to me that it can work.

https://github.com/facebookresearch/AR-Depth

The Google approach is similar, but no open-source code is available. It would be great if they supported ToF devices now and then used the software approach on other devices later.
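The core idea behind these software approaches is densification: the tracker already provides sparse, reliable depths at feature points, and the algorithm fills in the rest of the frame. The sketch below is only a crude stand-in using inverse-distance weighting, not AR-Depth's actual edge-aware, temporally filtered optimization:

```python
def densify(sparse, query, power=2.0):
    """Inverse-distance-weighted depth estimate at pixel `query` from
    sparse (x, y, depth) samples -- a crude stand-in for the edge-aware
    densification that real sparse-to-dense systems perform."""
    num = den = 0.0
    for x, y, d in sparse:
        dist2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if dist2 == 0:
            return d  # exact hit on a known sample
        w = dist2 ** (-power / 2.0)
        num += w * d
        den += w
    return num / den

# Two tracked feature points with known depths (made-up values).
samples = [(10, 10, 1.0), (30, 10, 3.0)]
print(densify(samples, (20, 10)))  # midpoint: equal weights -> 2.0
```

A real implementation would additionally respect image edges so that depth discontinuities line up with object boundaries, which is exactly the part missing from naive interpolation like this.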

SimonScholl commented 5 years ago

I totally agree with @lvonasek. Can you link a source for the Huawei statement?

For me and the people I know, ARCore is only something to play with. Its capabilities are so limited that it can hardly be used in products; real depth data via ToF would be far more meaningful. When they announced ARCore I hoped it would be the best of all worlds, but as it seems, we cannot expect big changes on this topic in the future.

Thaina commented 5 years ago

@lvonasek It's not that I don't wish all phones had a ToF depth sensor. I just say that it is not practical to have one in most consumer phones. Only some phones will add this feature as a bonus, so it is less likely to see wide adoption.

Surely it would eliminate almost all of AR's problems, but there is not much demand for it, unlike the multiple cameras used in normal photography. So I think we had better not put our hopes in ToF and should put more effort into dual cameras, unless there is demand for ToF in some other field (for what? I don't know; interior mapping, maybe?).

lvonasek commented 5 years ago

@SimonScholl - https://youtu.be/2xMa3UZbRUU at 1:36.08

@Thaina - I do not believe in massive adoption of either ToF sensors or stereo cameras (though of course there are more phones with stereo than with ToF). Anyway, I expect that AR on phones has no long-term future, but that's just my opinion.

Thaina commented 5 years ago

@lvonasek Stereo cameras already have massive adoption, and that trend will never drop, because there is demand in photography and many normal people use them outside AR. People use the camera more than AR in reality. So utilizing the stereo cameras that already exist today is the better option.

mpottinger commented 5 years ago

@lvonasek Yes, I also disagree. Because photography has a use for them, both stereo and especially ToF are growing rapidly in adoption on flagship phones now.

AR on phones might not be the best or preferred option in the future, but it won't go away because of dedicated headsets. It is simple to keep simplified phone AR as an option while also developing advanced headset AR.

lvonasek commented 5 years ago

@fredsa Any updates on this?

kexar commented 5 years ago

When is the ToF support presented by Huawei coming? I thought we would learn more at I/O, but there was no word about it :(

jdesbonnet commented 5 years ago

Yes, this is disappointing. I thought this would be announced on day 1 of Google I/O 2019. I'm holding back on purchasing my next device until I know it has a ToF camera I can access.

mpottinger commented 5 years ago

Yeah, kind of disappointed. Maybe it will come one day, but I am not holding my breath now and will need to come up with a software-based solution.

All I need from the dense depth is accurate placement of objects on uneven surfaces without plane detection.

This sort of works by just using hit tests on the feature points as anchors and disabling plane detection, but not well enough.

I need to place the objects instantly wherever I want them, and features are not always there. When they are, the result is still sometimes wildly inaccurate: I get an object way behind, above, or below where it was supposed to be.

I am hoping some kind of single-pixel depth classifier with a neural net can pull that off. Instead of full depth frames, just have a neural net infer the depth of a point on the screen on demand.

Some kind of very coarse occlusion would be OK too: just detect whether the pixel at the centre of an object is occluded and show/hide the object based on that.
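The coarse show/hide test described above boils down to comparing the inferred scene depth at the object's centre pixel against the virtual object's own depth. A minimal sketch; the function name and noise margin are made up for illustration:

```python
def is_occluded(scene_depth_m: float, object_depth_m: float,
                margin_m: float = 0.05) -> bool:
    """Hide the virtual object when real geometry at its centre pixel
    is closer to the camera than the object itself (with a small
    margin to absorb depth-estimation noise)."""
    return scene_depth_m + margin_m < object_depth_m

# Object placed 2 m away; real geometry inferred at 1.2 m hides it.
print(is_occluded(scene_depth_m=1.2, object_depth_m=2.0))  # True
print(is_occluded(scene_depth_m=3.0, object_depth_m=2.0))  # False
```

Per-pixel occlusion would apply the same comparison across the whole rendered silhouette rather than a single centre sample, which is where a dense depth map becomes necessary.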

lvonasek commented 5 years ago

@kexar - Currently the firmware versions of Honor/Huawei devices supporting ToF are available only in China (they have an AR Measure app using the ToF sensor); more than that I do not know.

@mpottinger - maybe this could be something for you: https://www.linkedin.com/feed/update/urn:li:activity:6512251608032641024

mpottinger commented 5 years ago

@lvonasek Really nice! Yes that is along the lines of what I need.

This is similar to what 6D.ai has achieved, but only on iOS. I've been waiting for them to announce Android support for months.

I tried the demos; unfortunately the accuracy doesn't seem to be enough for my application. 6D seemed just barely accurate enough, but is still iOS only.

Maybe if it had an external depth sensor for input, like the Structure Core, it would work.

I know depth from mono is difficult, though. That's why I am thinking of calculating only the points I need instead of a full frame or mesh.

jdesbonnet commented 5 years ago

Apologies if this query is a little off topic from the original question: but now that Google I/O 2019 has passed and there seems to be no way to access raw data from the ToF camera on ToF-enabled Android devices, can anyone recommend a low-cost (ideally less than $500) ToF camera with an open SDK? I'm looking for a resolution of 80x60 at the very least.

lvonasek commented 5 years ago

@jdesbonnet - There is a Camera2 API which makes it possible to read ToF data. On my Honor View 20 it reports two resolutions:

- 20x15 - this works correctly
- 240x180 - this returns nothing

I guess it is just a question of a device firmware update...
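For anyone trying this: the depth stream Camera2 exposes here is ImageFormat.DEPTH16, where each 16-bit sample packs the range in millimetres into the low 13 bits and a 3-bit confidence code into the high bits. A small Python sketch of the decoding as documented for Android; the sample value is made up:

```python
def decode_depth16(sample: int):
    """Decode one Android DEPTH16 sample (Camera2 ImageFormat.DEPTH16):
    low 13 bits = range in millimetres, high 3 bits = confidence code,
    where 0 means full confidence and n maps to (n - 1) / 7 otherwise."""
    depth_mm = sample & 0x1FFF
    conf_bits = (sample >> 13) & 0x7
    confidence = 1.0 if conf_bits == 0 else (conf_bits - 1) / 7.0
    return depth_mm, confidence

# Hypothetical sample: 1500 mm with confidence bits 0b111.
sample = (0b111 << 13) | 1500
print(decode_depth16(sample))  # -> (1500, 0.8571428571428571)
```

On a device you would read these samples out of the `Image` planes delivered by an `ImageReader` configured for DEPTH16.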

jdesbonnet commented 5 years ago

Oh, I didn't know that. So I presume it returns a greyscale image with value = range/depth. Can anyone verify whether this is also the case with the S10/S10e or the Huawei P30 Pro (the two Android devices on my shortlist for ToF experimentation)? A colleague of mine has the P30 Pro; is there any app on the Play Store we could use to check whether there is data from the ToF camera? Thx.

jdesbonnet commented 5 years ago

We got a dump from the Camera2 API Probe app on the Play Store ( https://play.google.com/store/apps/details?id=com.airbeat.device.inspector ) for the Huawei P30 Pro. It lists 5 camera devices, which seems right, but it's not clear which (if any) is the ToF camera. They all list resolutions way beyond what the ToF camera supports (640x480, as I understand it).

Huawei_P30_pro_camera2_api_probe.txt

The Camera2 API probe app output from the Samsung S10e:

Samsung_S10e_camera2_api_probe.txt

It lists 4 camera devices, but again the listed resolutions are all way in excess of what a ToF camera would provide, unless these are scaled for developer convenience.

lvonasek commented 5 years ago

@jdesbonnet - this app can list the depth camera resolutions: https://play.google.com/store/apps/details?id=com.hvt.camera2test

There are about ten smartphones with a ToF sensor. I tested the Oppo R17 Pro, and there the app returns no information about the ToF sensor. If I find a bit of time, I will make a ToF data viewer.

jdesbonnet commented 5 years ago

Yes, a data/image viewer would be very handy. There are a few apps on the Play Store that probe camera capabilities, but I can't find any that can grab an image from a specific camera at a specific resolution/setting.

mpottinger commented 5 years ago

@jdesbonnet The Structure Core sensor looks like a good option; that is what I am going with, as it seems the best fit for my use case.

It is not a ToF camera, it is stereo structured light, but it looks awesome.

Accessing raw ToF data on a phone would be great too, but I am in a rush to get something working, and it needs to be synced with the RGB data as well. The Structure sensor takes care of all that, plus it has pose estimation.

lvonasek commented 5 years ago

Here is my ToF data viewer: https://drive.google.com/open?id=19IyI3M3DL5_iF7Okf6FFlQPBn6JSawUX (on the Honor View 20 it works at a resolution of 20x15)

Note that 2019 smartphones with a ToF sensor have the depth map synced with RGB; however, the capture distance is very short.

jdesbonnet commented 5 years ago

Thanks for that. I checked the Samsung S10e with the above APK: "Camera2 API: ToF not found". I'll check the Huawei P30 Pro tomorrow.

mpottinger commented 5 years ago

@lvonasek Thanks! I don't have a phone with a ToF sensor yet. Wow, that is really low resolution.

I also didn't realize at first that these would be very short range on phones.

The Structure Core sensor it is, then!

jdesbonnet commented 5 years ago

@lvonasek Just to let you know that it works with the Huawei P30 Pro :-) The highest resolution we got it to work at was 280x140. I don't suppose you could share the source code for that? (No worries if not; it seems straightforward enough if it's just the Camera2 API.) Thanks for your work on this!


lvonasek commented 5 years ago

@jdesbonnet - Cool, can you maybe make a video of it (showing framerate, view distance, etc.)?

As a "thank you" for the video, here is the source code of the app (the code is not so nice, as I made it for testing purposes only): https://drive.google.com/open?id=1Fa1a7oBPH2E5wlDKTV3ATmaJDoMkRnXe