Hi @Nimingez!
You can see an example of how to train a policy under examples/3_train_policy.py. Make sure to collect a dataset first with examples/1_collect_demos.py!
I couldn't find how to modify the robot's starting position and orientation, so I'm unable to collect the dataset.
Hi @Nimingez,
Lines 32-42 of 1_collect_demos.py specify the demo collection design choices. For example, DEFAULT_DIST_FROM_HANDLE sets the default starting position of the gripper with respect to the target link of the target object; BBOX_TRAIN_RANDOMIZATION sets the relative percentage randomization applied to the target objects' bounding boxes; and XYZ_RANDOMIZATION and Z_ROT_RANDOMIZATION set the position and z-orientation randomization of the target object.
If you want more randomization, you can change the parameters above. Be careful, though: our default robot (FrankaMounted) has relatively limited reachability. If the randomization is too aggressive, or the initial gripper offset DEFAULT_DIST_FROM_HANDLE is too large, most scripted demonstrations may fail, causing slow demo collection.
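For reference, those parameters look roughly like the following; the values below are placeholders for illustration only, so check the script itself for the actual defaults:
# Placeholder values for illustration; the real defaults live in examples/1_collect_demos.py.
DEFAULT_DIST_FROM_HANDLE = 0.25          # assumed: default gripper offset from the target link's handle (m)
BBOX_TRAIN_RANDOMIZATION = 0.10          # assumed: +/- 10% scaling of target objects' bounding boxes
XYZ_RANDOMIZATION = (0.05, 0.05, 0.0)    # assumed: per-axis position randomization of the target object (m)
Z_ROT_RANDOMIZATION = 0.3                # assumed: z-orientation randomization of the target object (rad)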
I currently have the default position of the robotic arm below the ground. Changing the z value of XYZ_RANDOMIZATION initially places it above the ground, but it returns below the ground after a reset. My robot's translate is -300 -300 -300 and its orientation is 0.119 -1.956 77.296. Also, how many demos do I need?
XYZ_RANDOMIZATION is the amount of translation randomization applied to the robot between episodes. In your case that is +/- 2 m, which is pretty significant. The center point of the robot's offset with respect to the articulated object's handle is DEFAULT_DIST_FROM_HANDLE -- is that what you want to modify?
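To make the relationship concrete, here is a minimal sketch of how a per-episode offset could be sampled around the nominal handle-relative pose. This is not the actual sampling code in 1_collect_demos.py, and the values are assumptions for illustration:
import numpy as np

# Minimal illustrative sketch, not the script's actual sampling logic.
DEFAULT_DIST_FROM_HANDLE = np.array([0.25, 0.0, 0.0])  # assumed nominal offset from the handle (m)
XYZ_RANDOMIZATION = np.array([0.05, 0.05, 0.0])        # assumed per-axis half-ranges (m)

rng = np.random.default_rng(0)
# Uniform noise in [-XYZ_RANDOMIZATION, +XYZ_RANDOMIZATION], centered on the nominal offset.
noise = rng.uniform(-XYZ_RANDOMIZATION, XYZ_RANDOMIZATION)
episode_offset = DEFAULT_DIST_FROM_HANDLE + noise
print(episode_offset)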
Hi @Nimingez
For the number of demos, we collect 10k demos for all experiments presented in our paper. However, I do believe that for the door opening task, 1k demos may also work well.
I want to open a door, but acdc_output/step_2_output/door_0/cousin_results.json has "articulated": false. How can I change this?
{
  "metadata": {
    "n_cousins": 2,
    "n_objects": 1
  },
  "objects": {
    "door_0": {
      "articulated": false,
      "cousins": [
        {
          "category": "door",
          "model": "uatmde",
          "ori_offset": null,
          "z_angle": 1.5707963267948966,
          "snapshot": "/home/nmz/digital-cousins/assets/objects/door/model/uatmde/uatmde_75.png"
        },
        {
          "category": "door",
          "model": "hzfirv",
          "ori_offset": null,
          "z_angle": -1.5707963267948966,
          "snapshot": "/home/nmz/digital-cousins/assets/objects/door/model/hzfirv/hzfirv_25.png"
        }
      ]
    }
  }
}
Hi @Nimingez,
I believe that should be fine. The articulated entry is used to filter out irrelevant categories during the matching process. If articulated=False, our pipeline will simply select from all available categories (including articulated ones). As you can see, the matched cousins are in fact door categories, which should be articulated. You can check concretely that those models are openable by running the following:
import omnigibson as og
from omnigibson.utils.asset_utils import get_all_object_category_models_with_abilities

# Launch OmniGibson, then list all "door" models that expose the "openable" ability.
og.launch()
openable_door_models = get_all_object_category_models_with_abilities("door", {"openable": {}})
print(openable_door_models)
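As a small follow-up (assuming you have already run the snippet above), you can also check the specific cousin models from your cousin_results.json against that list:
# Check that the matched cousin models from cousin_results.json are openable.
for model in ("uatmde", "hzfirv"):
    print(model, "openable:", model in openable_door_models)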
Hello, I ran the following command:
python 1_collect_demos.py \
--scene_path ../tests/acdc_output/step_3_output/scene_0/scene_0_info.json \
--target_obj cabinet_4 \
--target_link link_1 \
--cousins bottom_cabinet,bamfsz,link_1 bottom_cabinet_no_top,vdedzt,link_0 \
--dataset_path test_demos.hdf5 \
--n_demos_per_model 3 \
--eval_cousin_id 0 \
--seed 0
which only collects 6 demos. Maybe I should set n_demos_per_model to 10k?
@jingma-git If you want to collect 10k demos in total and you have 2 digital cousins to collect demonstrations from, you may set n_demos_per_model to 5k.
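The arithmetic is simply total demos = n_demos_per_model multiplied by the number of cousin models, e.g.:
n_cousins = 2                          # cousins passed via --cousins
n_demos_per_model = 3                  # your current setting, giving 2 * 3 = 6 demos
print(n_cousins * n_demos_per_model)   # 6

target_total = 10_000                  # desired total number of demos
print(target_total // n_cousins)       # 5000, i.e. set n_demos_per_model to 5k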
I found that the link of the door is 'leaf'. Is it not possible to run it directly?
python 1_collect_demos.py \
--scene_path /home/xxx/digital-cousins/examples/images/acdc_output/step_3_output/scene_0/scene_0_info.json \
--target_obj door_0 \
--target_link leaf \
--dataset_path /home/xxx/digital-cousins/examples/images/acdc_output/test.hdf5
Hi @Nimingez,
Does the above not run for you? As long as target_link is a valid link name and has a handle attached to it, it should work!
Closing this issue for now. Please re-open if you continue to run into issues!
Thank you for your work. How should I train my robot on tasks in the generated scene?