Closed: lectrician1 closed this issue 1 year ago
Hi @lectrician1 -- First, I'll confess I haven't done a thorough look through your data, but at first glance, this appears to be two different issues. For the first issue, we see a perfect 90-degree rotation. Are you specifying the correct axis as up on import? As you might know, the two most common coordinate systems use different axes for up (Y and Z), and this causes confusion, for example, when importing certain meshes into Blender and other 3D software.
The other issue appears to be the above + a possible GPS accuracy issue, and maybe a bad model orientation initialization as a result. These are tricky issues to fix, absent better GPS data.
Are you specifying the correct axis as up on import?
How do you do that?
How do you do that?
Blender has a checkbox on the OBJ import panel to do a 90deg rotation.
https://docs.blender.org/manual/en/3.5/files/import_export/obj.html
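To illustrate what that import-time rotation does, here is a minimal sketch (not ODM or Blender code, just the underlying math): converting a Y-up mesh to Z-up is a 90-degree rotation of every vertex about the X axis. The function name is hypothetical.

```python
import numpy as np

# 90-degree rotation about the X axis. It maps the Y axis onto the
# Z axis, i.e. it converts a Y-up mesh to Z-up.
ROT_X_90 = np.array([
    [1.0, 0.0,  0.0],
    [0.0, 0.0, -1.0],
    [0.0, 1.0,  0.0],
])

def y_up_to_z_up(vertices):
    """Rotate an (N, 3) array of vertices from Y-up to Z-up."""
    return np.asarray(vertices, dtype=float) @ ROT_X_90.T

# The old "up" direction lands exactly on the new up axis:
print(y_up_to_z_up([[0.0, 1.0, 0.0]]))  # [[0. 0. 1.]]
```

Blender's import checkbox applies the same kind of fixed rotation, just at load time instead of to the file itself.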
Oh, I thought you meant there was a way to specify in ODM which orientation you want the output model to be in. Is there any way to set or adjust it in ODM? Is it technically possible for ODM to determine the correct orientation?
Oh, I thought you meant there was a way to specify in ODM which orientation you want the output model to be in. Is there any way to set or adjust it in ODM?
We are compliant with the Wavefront OBJ specification, but different 3D modeling software packages and environments assume different "up" axes. This is just how things are, so Blender provides you with an option to do a rotation upon import.
Is it technically possible for ODM to determine the correct orientation?
Like Stephen mentioned, with wildly varying GPS data and inaccuracy in ground-level collects, this is not an easy problem to solve. I've run into it numerous times myself with data captured with my cellphone.
Is it technically possible for ODM to determine the correct orientation?
Without location information (e.g. GPS) or absolute orientation (e.g. from a gimbal), not easily, no. Some heuristics could improve the chances of getting it right, or further user input could disambiguate the problem, but either approach will not generalize to all possible scenes and scenarios.
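One such heuristic (purely a sketch, not anything ODM implements) is to fit a plane to the reconstructed camera positions: for a ground-level capture like a bike ride, the cameras move roughly on a level surface, so the normal of the best-fit plane approximates "up". As noted above, this does not generalize; a capture on a steep hill, for instance, would give a tilted answer.

```python
import numpy as np

def estimate_up_axis(camera_positions):
    """Estimate a world 'up' direction as the unit normal of the
    least-squares plane through the camera positions (via SVD).

    Only meaningful when the cameras moved roughly on a level
    surface; the sign of the result is ambiguous.
    """
    pts = np.asarray(camera_positions, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Rows of vt are the principal directions; the last row (smallest
    # singular value) is the direction of least variance = plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

# Synthetic test: cameras spread in the X-Y plane with small height noise.
rng = np.random.default_rng(0)
cams = np.column_stack([
    rng.uniform(-50, 50, 200),   # x
    rng.uniform(-50, 50, 200),   # y
    rng.normal(0.0, 0.2, 200),   # z: nearly flat
])
up = estimate_up_axis(cams)
print(up)  # close to (0, 0, +/-1)
```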
How did you install ODM? (Docker, installer, natively, ...)?
Docker
What is the problem?
When I process GPS-tagged (altitude, latitude, and longitude) equirectangular images that were taken from the ground on my bike, the orientation of the final model, both in Potree and in the downloaded files, is incorrect, making viewing the model particularly challenging since there are limits on the axes you can orbit around.
As a result, I am forced to manually re-orient the images.
Here is an example:
Google Drive with OBJ
The ground of the model, which currently lies along the Y axis, should lie in the plane created by the X and Z axes.
Here is another example from a different set of images (some overlap in images used):
Google Drive with OBJ
It also has a non-straight camera line this time:
And a final example (no overlap in images used):
And an example of what its orientation should ACTUALLY be (corrected in Blender):
Google Drive with OBJ and reoriented OBJ
What should be the expected behavior? If this is a feature request, please describe in detail the changes you think should be made to the code, citing files and lines where changes should be made, if possible.
The orientation should be correct, with the model's base on the plane created by the X and Z axes.
How can we reproduce this? What steps did you do to trigger the problem? If this is an issue with processing a dataset, YOU MUST include a copy of your dataset AND task output log, uploaded on Google Drive or Dropbox (otherwise we cannot reproduce this).
Here is the dataset for the first example (100 images) and here is the dataset for the third example (75 images).
Here are the options I used:
auto-boundary: true, camera-lens: equirectangular, sky-removal: true
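For reference, a sketch of how those options would look as a Docker invocation of ODM; the volume path and project name are placeholders and may not match the actual setup used:

```shell
docker run -ti --rm -v /path/to/datasets:/datasets opendronemap/odm \
  --project-path /datasets my-project \
  --auto-boundary \
  --camera-lens equirectangular \
  --sky-removal
```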
I'm not sure where the task output log is stored so I haven't uploaded it.