Closed: unageek closed this issue 1 year ago.
Did you try this?

Or you can duplicate your pipeline to keep your reconstruction.
Thank you for your response! Since I'm now interested in the raw dense point cloud from the Meshing node rather than the mesh, I have tried the second pipeline. However, the Meshing node fails with the error [fatal] Depth map fusion gives an empty result.
Could you check to put the SfMTransform in input of the PrepareDenseScene? BTW, if you only want the dense point cloud, there is an option on the Meshing node to save the raw dense point cloud.
Could you check to put the SfMTransform in input of the PrepareDenseScene?
Thank you for the suggestion, it is working without any problem!
Now I am trying to convert densePointCloud_raw.abc to the PLY format with a ConvertSfMFormat node, but the output precision is too low. Here is an excerpt:
-3.74402e+06 3.663e+06 3.62752e+06 8 15 7
-3.74402e+06 3.663e+06 3.62752e+06 89 114 23
-3.74401e+06 3.66299e+06 3.62751e+06 114 147 40
-3.744e+06 3.66301e+06 3.6275e+06 7 9 4
-3.744e+06 3.66301e+06 3.6275e+06 31 40 19
I have also tried JSON format, but the precision is not enough either:
{
  "landmarkId": "4985808",
  "descType": "unknown",
  "color": [
    "164",
    "169",
    "173"
  ],
  "X": [
    "-3743836.75",
    "3663055.25",
    "3627634.25"
  ]
},
Is there a way to increase the output precision? Wait, it looks like the coordinates are in single precision...?
if you only want the dense Point cloud, there is an option on the meshing node to save the raw dense point cloud.
Yes, I have turned that option on already. It would be nice to have an option that stops the Meshing node as soon as the point cloud is saved.
I inserted stream << std::setprecision(17); before the line in question, and here is what I got:
-3744018.25 3662998.25 3627522.25 8 15 7
-3744016.5 3662997.5 3627518.75 89 114 23
-3744011.5 3662994.75 3627508.75 114 147 40
-3743997.5 3663005.75 3627504.5 7 9 4
-3743995.75 3663006.5 3627503 31 40 19
Apparently, the coordinates are rounded to single precision before conversion.
Hi, I'm closing this issue as it has been resolved by alicevision/AliceVision#1486 (thanks for your contribution, @unageek!).
Describe the problem
I am trying to transform the reconstructed mesh to world coordinates (ECEF). Since my photos contain GPS data, the required transformation should be estimable by an SfMTransform node with the from_gps method. The node succeeds, and I get the following information in its log:
However, this result is not usable since too few digits are displayed, especially for the translation part. How do I obtain the precise coefficients estimated by the node? I am also not certain of the order in which the scaling, rotation, and translation should be applied.
Screenshots
Desktop:
Additional context
If I try to use the output of the SfMTransform node as the input to the Meshing node, the node fails with the error [fatal] No camera to make the reconstruction, probably due to the too-large coordinates. Thus that is not an option.