-
I found that for some environments, such as PickSingleYCB, the evaluation protocol in the documentation shows not only 5 episodes per object in the training set (74 YCB objects) but also 10 episodes…
-
Hello.
I have generated data using this command:
`python single_video_pybullet.py --nb_frames 10000 --scale 0.001 --path_single_obj ~/Deep_Object_Pose/scripts/nvisii_data_gen/models/Gear/google_16k…
-
According to the rulebook, in Tidy Up Here Task 2b the operator specifies a drink for the robot to handle.
On the other hand, the known objects are all said to be taken from the YCB dataset, which contains no drink data.
What objects will be used in that task?
Specifically, will objects from the YCB dataset (objects that are not drinks) be used…
-
Hey, I want to use your model on a real live stream with the YCB dataset.
I have already managed to run it live on my Intel RealSense Camera.
Just to be sure it will work well on my camera:
1. I…
-
Hello, I would like to ask how the NormalizationParameters and CatPose2InsPose.npy files under the YCB folder are generated. How can I apply them to my own objects? I could not find any reference material on this. Looking forward to, and thankful for, your reply.
-
Hi Kevin,
in my pull request https://github.com/kevinzakka/obj2mjcf/pull/12 you asked about other features I could find useful for this library.
I thought about creating a kind of feature-request …
-
Hi @ethnhe, this is really nice work. When I tested with the code you provided, I achieved excellent results on the YCB dataset. But when I loaded the trained model and tested it in my other code, t…
-
Hello.
I would like to refer to the grasps that you have associated with each object, so I am processing object_taxonomies.npy.
This .npy file contains a 58x33 matrix. I think 58 is probably the number of obje…
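For reference, this is how I am loading and inspecting the file (a minimal sketch; since I don't want to assume anything about the real file's contents here, I use a placeholder array of the same 58x33 shape):

```python
import numpy as np

# Placeholder standing in for object_taxonomies.npy, with the same
# 58x33 shape I observed (rows presumably objects, columns taxonomies).
demo = np.zeros((58, 33))
np.save("object_taxonomies_demo.npy", demo)

# Load it back the same way I load the real file and check its shape.
mat = np.load("object_taxonomies_demo.npy")
print(mat.shape)  # -> (58, 33)
```

My question below is about what those 58 rows and 33 columns actually correspond to.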
-
I am a student at MIT in Russ Tedrake's Robotic Manipulation course. I have been able to successfully run one of his scripts (http://manipulation.csail.mit.edu/manipulation/clutter_maskrcnn_data.py) t…
-
## Habitat-Lab and Habitat-Sim versions
Habitat-Lab: 0.3.0
Habitat-Sim: 0.3.0
## ❓ Questions and Help
I'm trying to run the baseline regarding the Social Navigation task and I have some proble…