-
Thanks for your great work! I ran your code on some datasets and have a few questions.
1. About the FPS mentioned in your main paper: I ran your code on an A40 server, on the Replica dataset, about 2000 frames. The ti…
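For context on such FPS comparisons: an end-to-end number measured over the whole sequence can differ noticeably from per-frame timings printed inside the loop, especially when some stages run only every Nth frame. A minimal sketch of the wall-clock measurement (`process_frame` is a hypothetical stand-in for the system's per-frame work, not a function from the repository):

```python
import time

def measure_fps(process_frame, frames):
    """Wall-clock FPS over a whole sequence: total frames / total seconds.

    Note: this end-to-end number can differ a lot from per-frame FPS
    printed inside the loop if some stages (e.g. mapping, global BA)
    run only intermittently.
    """
    t0 = time.perf_counter()
    for frame in frames:
        process_frame(frame)
    elapsed = time.perf_counter() - t0
    return len(frames) / elapsed
```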
-
Hello!
I'm using a Jetson production module running Ubuntu 18.04, and I can't install OpenCV 4.5.1.
I have already run sudo apt-get update and sudo apt-get upgrade.
This is the output…
-
@christiankerl
I have tried dvo_slam from https://github.com/songuke/dvo_slam and compiled the dvo_slam package successfully in ROS Indigo on Ubuntu 14.04.
Using the TUM RGB-D benchmark, I can see righ…
-
Hello, I successfully used your implementation to train on the 360 images I collected. The entire pipeline is as follows:
Insta360 Capture -> opensfm compute 360 pose -> Perspective-and-Equirectangula…
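For readers reproducing this pipeline: the perspective-and-equirectangular conversion step can be sketched as a backward mapping from pinhole pixels to equirectangular coordinates. This is a generic sketch, not the repository's actual converter; the function name and the y-down, z-forward camera convention are assumptions:

```python
import numpy as np

def equirect_to_perspective_grid(h, w, fov_deg, yaw_deg, pitch_deg, eq_h, eq_w):
    """Return (u, v) sample coordinates into an equirectangular image for
    every pixel of an h x w pinhole view (y-down, z-forward convention)."""
    f = 0.5 * w / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    xs = np.arange(w) - (w - 1) / 2
    ys = np.arange(h) - (h - 1) / 2
    x, y = np.meshgrid(xs, ys)
    z = np.full_like(x, f, dtype=np.float64)
    dirs = np.stack([x, y, z], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # orient the virtual camera: yaw about the vertical axis, then pitch
    cy, sy = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    cp, sp = np.cos(np.radians(pitch_deg)), np.sin(np.radians(pitch_deg))
    R_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    R_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    dirs = dirs @ (R_yaw @ R_pitch).T
    # ray direction -> spherical angles -> equirectangular pixel coords
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])   # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))  # [-pi/2, pi/2]
    u = (lon / np.pi + 1) / 2 * (eq_w - 1)
    v = (lat / (np.pi / 2) + 1) / 2 * (eq_h - 1)
    return u, v
```

The returned grid can then be fed to any bilinear-sampling routine to produce the perspective crop.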
-
Hi,
I would like to connect my OAK-D camera to rtabmap as an RGB-D camera, but I am not able to get the configuration right. The problem appears in the terminal as:
```
[ WARN] [1624989827.670520212]: odomet…
```
-
_From @lincolnfrog on September 20, 2018 20:47_
This is important if we want to support a safer version of AR that people will use on less-trusted areas of the web. Somehow we need to restrict what i…
-
Hi, thank you very much for your great work.
I used a D435i to build a dataset similar to the TUM RGB-D format. For ground truth I use the poses output by colmap; after modifying freiburg1_desk.yaml I successfully ran my own dataset, b…
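For anyone building a similar dataset: COLMAP's images.txt stores world-to-camera poses as `QW QX QY QZ TX TY TZ`, while TUM's groundtruth.txt expects camera-to-world lines of the form `timestamp tx ty tz qx qy qz qw`, so each pose has to be inverted. A minimal sketch of the conversion (function names are hypothetical):

```python
import numpy as np

def quat_to_rot(qw, qx, qy, qz):
    """Rotation matrix from a unit quaternion in (w, x, y, z) order."""
    return np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ])

def colmap_to_tum(qw, qx, qy, qz, tx, ty, tz, timestamp):
    """Invert COLMAP's world-to-camera (q, t) to the camera-to-world pose
    TUM expects: C = -R^T t, and q_c2w is the conjugate of q."""
    R = quat_to_rot(qw, qx, qy, qz)
    C = -R.T @ np.array([tx, ty, tz])
    # inverse of a unit quaternion is its conjugate
    qw2, qx2, qy2, qz2 = qw, -qx, -qy, -qz
    # TUM order: timestamp tx ty tz qx qy qz qw
    return (f"{timestamp:.6f} {C[0]:.6f} {C[1]:.6f} {C[2]:.6f} "
            f"{qx2:.6f} {qy2:.6f} {qz2:.6f} {qw2:.6f}")
```

Writing one such line per image (with the image timestamp) yields a groundtruth.txt that the TUM association/evaluation scripts accept.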
-
Hey, thanks for the great work!
I would like to ask how you get the point cloud for initialization when training on MatrixCity data? Or could you provide the ply file from SfM for the dataset?
Thank…
-
I know there were already several requests related to the possible support of Tango devices with ARCore. My question is based on the fact that we want to use the already available depth sensors to ca…