google-ar / arcore-unity-sdk

ARCore SDK for Unity
https://developers.google.com/ar

3D Reconstruction #228

Closed: Godric877 closed this issue 6 years ago

Godric877 commented 6 years ago

Hi,

I have been working on an app that tracks and stores the point cloud of its environment. I am extracting that point cloud into a .obj file and displaying it with existing 3D viewers. Next, I plan to build a mesh from the point cloud and texture the material using the camera image to generate the final model.

I am having the following problems:

1) The point cloud provided by ARCore does not seem dense enough for 3D reconstruction. Is there a way to generate a denser point cloud?

2) The device I am working with is a Moto G6 Plus with a dual camera. Is there a way to use the two cameras to provide extra information through ARCore that might improve the point cloud density?

3) Is there a way to get the world coordinates of all the feature points displayed by ARCore? I am having trouble extracting reliable point cloud information from ARCore. Since I need the world coordinates, in each frame I raycast from the screen at 100-pixel intervals, and if the TrackableHit is a FeaturePoint I place an anchor at that position so that ARCore keeps tracking the point and updates its pose as the camera moves through the environment. Then, on a button click, I extract all FeaturePoint trackables and write their coordinates to a file on the phone, which I then visualise with a 3D point cloud viewer app. However, I am not having any success with it.
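
Concretely, this is roughly what I am doing each frame (a simplified sketch; the class name, the anchor list, and the grid loop are placeholders, not my exact code):

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class FeaturePointSampler : MonoBehaviour
{
    // Anchors placed at feature-point hits so ARCore keeps updating their poses.
    private readonly List<Anchor> m_Anchors = new List<Anchor>();

    public void Update()
    {
        if (Session.Status != SessionStatus.Tracking)
        {
            return;
        }

        // Raycast a grid of screen positions at 100-pixel intervals.
        for (int x = 0; x < Screen.width; x += 100)
        {
            for (int y = 0; y < Screen.height; y += 100)
            {
                TrackableHit hit;
                if (Frame.Raycast(x, y, TrackableHitFlags.FeaturePoint, out hit) &&
                    hit.Trackable is FeaturePoint)
                {
                    m_Anchors.Add(hit.Trackable.CreateAnchor(hit.Pose));
                }
            }
        }
    }
}
```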

Will ARCore support 3D Reconstruction as was possible in Project Tango?

Any help would be highly appreciated. Thanks :)

dzak83 commented 6 years ago

+1, we would also like to have spatial mapping as part of ARCore. The mesh in our case doesn't have to be detailed.

pablisho commented 6 years ago

Hi, thanks for the feature request. ARCore currently does not support 3D reconstruction.

Godric877 commented 6 years ago

Can you please also comment on my other questions? It would be a huge help and would let me understand ARCore better.

pablisho commented 6 years ago

Sure.

1 and 2. Currently there's no way to input more information into ARCore to get a better or denser point cloud.

3. The point cloud is already in world coordinates. I don't think you need to add a Trackable for each feature point; that sounds like overkill. Every time the PointCloud is updated you could store those points using CopyPoints and then write them to a file or wherever you need them. You may get some duplicates, though.
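
For example, a minimal sketch of that idea, assuming the CopyPoints(List&lt;Vector4&gt;) overload (x, y, z in world space, confidence in w); the class and method names are placeholders, so adjust to whatever the API looks like in your SDK version:

```csharp
using System.Collections.Generic;
using System.IO;
using GoogleARCore;
using UnityEngine;

public class PointCloudRecorder : MonoBehaviour
{
    private readonly List<Vector4> m_FramePoints = new List<Vector4>();
    private readonly List<Vector4> m_AllPoints = new List<Vector4>();

    public void Update()
    {
        // Only copy when ARCore produced a new point cloud this frame.
        if (Frame.PointCloud.IsUpdatedThisFrame)
        {
            m_FramePoints.Clear();
            Frame.PointCloud.CopyPoints(m_FramePoints);
            m_AllPoints.AddRange(m_FramePoints);
        }
    }

    // Hook this up to a UI button to dump everything collected so far as .obj vertices.
    public void Save(string path)
    {
        using (var writer = new StreamWriter(path))
        {
            foreach (var p in m_AllPoints)
            {
                writer.WriteLine("v " + p.x + " " + p.y + " " + p.z);
            }
        }
    }
}
```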

Does that help?

Godric877 commented 6 years ago

Thanks! It does help.

I have some more questions/clarifications if you don't mind.

1) Does each frame have a different world coordinate system?

2) If so, is there a way I can acquire the coordinates of all the points in a single coordinate system?

I intend to scan the environment and obtain a cumulative point cloud in a single reference frame (a single coordinate system).

Can you please point me in the right direction for this?

pablisho commented 6 years ago

Each frame's point cloud is in Unity world coordinates, so the same feature point in two different frames will have the same position. No conversion is needed to do what you want; you just need to accumulate the point clouds over time and discard the duplicate points.
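
One simple way to discard duplicates is a voxel-grid filter: treat points that fall into the same small cell as the same point. This is just a sketch; the class name and the 1 cm cell size are my own choices, not something ARCore prescribes:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class PointCloudAccumulator
{
    private const float k_CellSize = 0.01f; // 1 cm grid

    private readonly HashSet<Vector3Int> m_Cells = new HashSet<Vector3Int>();
    private readonly List<Vector3> m_Points = new List<Vector3>();

    public IList<Vector3> Points { get { return m_Points; } }

    // Add a world-space point, skipping it if we already have one in the same cell.
    public void Add(Vector3 worldPoint)
    {
        var cell = new Vector3Int(
            Mathf.FloorToInt(worldPoint.x / k_CellSize),
            Mathf.FloorToInt(worldPoint.y / k_CellSize),
            Mathf.FloorToInt(worldPoint.z / k_CellSize));

        if (m_Cells.Add(cell))
        {
            m_Points.Add(worldPoint);
        }
    }
}
```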

Godric877 commented 6 years ago

Thank you for the answer, but I am still encountering some problems using the information you mentioned above. Here is my code:

```csharp
// Write the current frame's point cloud (already in Unity world coordinates) as .obj vertices.
for (int i = 0; i < Frame.PointCloud.PointCount; i++)
{
    m_Points[i] = Frame.PointCloud.GetPoint(i);
    float x = m_Points[i].x;
    float y = m_Points[i].y;
    float z = m_Points[i].z;

    string point_cloud = "v " + x.ToString() + " " + y.ToString() + " " + z.ToString();
    sr.WriteLine(point_cloud);
}
```

sr is a StreamWriter in the lines above. The results I am getting do not resemble the actual environment. I am attaching below the object I am analysing and the visualisation of the point cloud (I am using the app "3D Model Viewer" available on the Play Store). They do not resemble each other at all.

Please guide me as to where I am going wrong in my approach.

Also, could you please let me know when I should be using Anchors? Are anchors supposed to change their coordinates when the device moves?

I am sorry to trouble you, but I would be truly grateful for your help since I seem to be stuck here.

[Attached images: a photo of the object being scanned and a screenshot of the point cloud visualisation.]

pablisho commented 6 years ago

I cannot speak for the visualization tool that you are using, but you may be having an issue with the coordinate system that the tool is using compared to Unity world space coordinates.
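
For example, Unity uses a left-handed, Y-up coordinate system, while many .obj viewers assume a right-handed one, so you may need to negate one axis on export. A minimal sketch (negating Z is an assumption about what your viewer expects, so you may need to experiment):

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

public static class ObjExporter
{
    public static void Write(string path, IEnumerable<Vector3> points)
    {
        using (var writer = new StreamWriter(path))
        {
            foreach (var p in points)
            {
                // Negate Z to convert Unity's left-handed coordinates to a right-handed system.
                writer.WriteLine(string.Format("v {0} {1} {2}", p.x, p.y, -p.z));
            }
        }
    }
}
```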

You use an Anchor whenever you want to anchor a virtual object to a fixed position in the real world. When the device moves, think of it as the camera moving while the anchor stays fixed to the real world. This guide should help: https://developers.google.com/ar/develop/developer-guides/anchors

Godric877 commented 6 years ago

Ok, Thanks a lot for your help! :)

dzak83 commented 5 years ago

A small follow-up on ARCore meshing, as there might soon be a solution. It's not perfect since it works on top of the feature points, so it will only be as good as the points are, but it might be enough until there is an official fix from Google.

You can read more at http://ARMeshing.com or watch here: https://www.youtube.com/watch?v=jEOl-CeI2Gg