-
### Describe the bug
Error in "Training an agent with SKRL on Isaac-Reach-Franka-v0"
In the Orbit documentation, I followed 'Getting Started >> Running existing scripts >> Reinforcement Learning >> Train…
-
Hi,
I have been trying to use the implicit actuator with velocity control on my own robot model in the devel branch. It worked well in the main branch; however, I saw some iss…
-
I am trying to import a tracked robot, as shown below, into Orbit; however, I cannot find related sources to support that. For now I am trying to use multiple wheels to replace the tracks, but I would still like to see if…
-
### Question
While trying to run rsl_rl with ANYmal-C in the velocity environment,
`./orbit.sh -p source/standalone/workflows/rsl_rl/train.py --task Isaac-Velocity-Anymal-C-v0 --headless` …
-
You can play with the space & retrieval models here: https://b3246e5ab28482f60e.gradio.live - Not all models & indices are cached yet, so some first runs may be slow, but once cached it should be blazin…
-
Hello, and thank you for taking the time to read my post.
> Context:
A very recent Arch Linux installation (less than a week old);
Access to Arch Linux packages via pacman, stable and functional;…
-
### Question
In legged_robot.py:
` # TODO: contact forces -- Waiting for contact sensors in IsaacSim.`
` # For now, use heuristics for flat terrain to say feet are in contact.`
` # air times`…
-
### Question
I am trying to create an environment where I need to add two cameras to the Franka Panda Hand, one looking at the target from the left and the other from the right. I tried to achieve th…
-
### ❓ Question
Hi, I am wondering whether the developers will include a multi-agent reinforcement learning (MARL) model in the near future. If yes, will it be adapted to robotics platforms such as Omnive…
-
Hi, I got the following warnings when evaluating the "open drawer" task, and the results are also worse than in the original paper:
`test: 25.21/31.09
; object: 1.94/6.45
; scene: 11.76/20.78
; s…