-
Hello, if I have a well-defined geometry, multiple images, and the corresponding camera poses, how can I skip the MVE part and use MVS Texturing directly?
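If I remember the mvs-texturing README correctly, texrecon does not strictly need an MVE scene: it can also be pointed at a plain folder containing the images together with per-image MVE-style .cam files. Below is a minimal sketch (not the project's own tooling) that writes such .cam files from known pixel intrinsics K and world-to-camera poses [R | t]; the function name, file layout, and the exact pose/normalization conventions are my assumptions, so verify them against the documented .cam format before relying on this.
```
# Hedged sketch: write MVE-style .cam files next to the images so texrecon can
# be run on a plain folder instead of an MVE scene. Conventions assumed here:
#   line 1: tx ty tz r00 r01 r02 r10 r11 r12 r20 r21 r22  (world-to-camera)
#   line 2: f d0 d1 paspect ppx ppy  (f normalized by max(width, height),
#           principal point normalized by width / height)
import numpy as np

def write_cam_file(path, K, R, t, width, height):
    extrinsics = " ".join(f"{v:.10f}" for v in np.concatenate([np.ravel(t), np.ravel(R)]))
    f_norm = K[0, 0] / max(width, height)          # assumes square pixels
    intrinsics = f"{f_norm:.10f} 0 0 1 {K[0, 2] / width:.10f} {K[1, 2] / height:.10f}"
    with open(path, "w") as fh:
        fh.write(extrinsics + "\n" + intrinsics + "\n")

# e.g. scene_dir/view_0000.jpg + scene_dir/view_0000.cam for every image, then:
#   texrecon ./scene_dir my_mesh.ply textured_output
```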
-
I am trying to append the current camera view to a trajectory class
```
import numpy as np
import open3d as o3d
trajectory = o3d.camera.PinholeCameraTrajectory()
def key_callback(vis):…
```
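For reference, here is a minimal, self-contained sketch of one way to finish that callback, assuming a VisualizerWithKeyCallback setup. Note that in some Open3D builds `PinholeCameraTrajectory.parameters` returns a copy, so appending in place does not stick and the whole list has to be reassigned; the key binding and placeholder geometry below are arbitrary choices.
```
import open3d as o3d

trajectory = o3d.camera.PinholeCameraTrajectory()

def key_callback(vis):
    # Capture the current view as PinholeCameraParameters (intrinsic + extrinsic).
    params = vis.get_view_control().convert_to_pinhole_camera_parameters()
    # .parameters may hand back a copy, so reassign the list instead of .append().
    trajectory.parameters = list(trajectory.parameters) + [params]
    return False  # no geometry changed, no redraw needed

vis = o3d.visualization.VisualizerWithKeyCallback()
vis.create_window()
vis.add_geometry(o3d.geometry.TriangleMesh.create_sphere())  # placeholder geometry
vis.register_key_callback(ord("S"), key_callback)
vis.run()
vis.destroy_window()
```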
-
I am currently using an Intel RealSense T265 (with the realsense-ros wrapper) together with VINS-Mono.
When I started vins-estimator and moved the camera forward by 10 cm, I could see in vins-rviz that the cam…
-
### Description
A recent incident has shown that we need to ensure our generated weights don't exceed the maximum block weight or PoV size. Therefore, we need a CI job which checks that …
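As one possible shape for such a job, here is a rough sketch (not the actual check) that scans generated weight files for literal `Weight::from_parts(ref_time, proof_size)` values and fails if any exceed a limit. The directory argument, the regex, and the two limit constants are placeholders and would need to be wired to the real runtime constants.
```
import pathlib
import re
import sys

# Placeholder limits -- the real job would read these from the runtime config.
MAX_REF_TIME = 2 * 10**12      # assumed: roughly 2 s of ref_time per block
MAX_PROOF_SIZE = 5 * 1024**2   # assumed: roughly 5 MiB PoV size

FROM_PARTS = re.compile(r"Weight::from_parts\(\s*([\d_]+)\s*,\s*([\d_]+)\s*\)")

def check_file(path):
    problems = []
    for match in FROM_PARTS.finditer(path.read_text()):
        ref_time = int(match.group(1).replace("_", ""))
        proof_size = int(match.group(2).replace("_", ""))
        if ref_time > MAX_REF_TIME or proof_size > MAX_PROOF_SIZE:
            problems.append(f"{path}: {match.group(0)} exceeds a limit")
    return problems

if __name__ == "__main__":
    weights_dir = pathlib.Path(sys.argv[1] if len(sys.argv) > 1 else "weights")
    problems = [p for f in weights_dir.rglob("*.rs") for p in check_file(f)]
    print("\n".join(problems))
    sys.exit(1 if problems else 0)
```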
-
We tested this algorithm on the images we collected ourselves and found that the results were very poor. Our calibration board is printed directly, and our camera FOV is approximately 120 degrees. The…
-
Hello,
Is there any way to access the calibration matrix between the IMU and the LiDAR from the Livox SDK or driver? And is there any way to tell whether the calibration has been disturbed?
Thank you
-
I have not checked all of these warnings, but I see that LVT and WVT are correctly flagged: they appear in the denominator of expressions:
```
DVT_long = - AVT * `MAXA(ln(Leff / LVT),0.0,1.0e-2);
D…
```
-
Thanks for the really cool work and the dataset.
I am just wondering: does the released dataset contain camera parameters (intrinsics and extrinsics)?
Thank you!
-
Is there a way to run blender/nerfstudio datasets with the current repo?
I have data with images and pointcloud.ply files, and a transforms.json.
I am having a hard time implementing a custom load…
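Not this repo's own loader, but here is a minimal sketch of how a Blender/nerfstudio-style transforms.json is usually parsed into per-frame intrinsics and camera-to-world poses. The key names follow the common conventions (`fl_x`/`fl_y`/`cx`/`cy` or `camera_angle_x`, `frames[*].file_path`, `frames[*].transform_matrix`) and may differ from what your file actually contains.
```
# Hedged sketch of parsing a Blender/nerfstudio-style transforms.json into a
# shared intrinsic matrix K and per-frame camera-to-world poses. Key names are
# assumptions based on the common conventions, not this repo's loader.
import json
import math
import numpy as np

def load_transforms(path, width=None, height=None):
    with open(path) as fh:
        meta = json.load(fh)
    w = meta.get("w", width)
    h = meta.get("h", height)
    if "fl_x" in meta:                        # nerfstudio-style explicit intrinsics
        fx, fy, cx, cy = meta["fl_x"], meta["fl_y"], meta["cx"], meta["cy"]
    else:                                     # Blender/NeRF-style: camera_angle_x only
        fx = fy = 0.5 * w / math.tan(0.5 * meta["camera_angle_x"])
        cx, cy = 0.5 * w, 0.5 * h             # width/height must be supplied here
    K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)
    frames = [(f["file_path"], np.array(f["transform_matrix"], dtype=np.float64))
              for f in meta["frames"]]        # transform_matrix: camera-to-world, usually 4x4
    return K, frames

# Note: Blender/NeRF poses use an OpenGL-style camera (x right, y up, z backward);
# converting to an OpenCV/COLMAP convention means flipping the y and z axes.
```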
-
### Checklist
- [X] I have searched for [similar issues](https://github.com/isl-org/Open3D/issues).
- [X] For Python issues, I have tested with the [latest development wheel](http://www.open3d.org…