-
Trying to hook it up to CartPole:
```py
print(self.env.reset())
(pid=2462395) [ 0.01369617 -0.02302133 -0.04590265 -0.04834723]
```
but it expects:
```py
obs, info = self.env.reset()
```
i.e. …
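A minimal compatibility sketch for this mismatch, assuming it is the usual one between the old Gym `reset()` (returns the observation only) and the Gymnasium `reset()` (returns `(obs, info)`); `reset_compat` is a hypothetical helper name, not part of either library:

```python
def reset_compat(env):
    """Return (obs, info) regardless of which reset() API the env uses."""
    result = env.reset()
    if isinstance(result, tuple) and len(result) == 2:
        return result      # new-style Gymnasium API: (obs, info)
    return result, {}      # old-style Gym API: obs only, synthesize info
```

With this shim, `obs, info = reset_compat(self.env)` works against both APIs.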
-
CartPole has this code, which sets reward = 0 only if the previous state was terminal:
https://github.com/RobertTLange/gymnax/blob/aef77d5c642ea48b95f34c51d05b8417d9450e15/gymnax/environments/classic_…
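The pattern being described, as a hedged sketch (an illustration of the logic, not the actual gymnax source):

```python
def cartpole_reward(prev_done: bool) -> float:
    """1.0 for every live step; 0.0 only once the episode has already ended."""
    return 0.0 if prev_done else 1.0
```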
-
Hi,
Big fan of this project! I'm trying to train an RL agent on a bunch of large environments at once, and I'm seeing an issue where some linkages are static/immobile when they shouldn't be. Here a…
-
### Describe the bug
When a DirectRLEnv has a max_episode_length property set via gym.register,
Isaac Lab crashes when a truncation occurs. The root cause is a single "Tr…
-
Hi,
I'm using Isaac Sim 4.1.0. After following the installation steps, I ran the omniisaacgymenvs CartPole demo with `PYTHON_PATH scripts/rlgames_train.py task=Cartpole`, and this is what happened:
![image](https://github.com/user-attachments/assets/fbd739d2-89…
-
### 📚 Documentation
I obtained optimal hyperparameters for training CartPole-v1 from [RLZoo3][1]. I have created a minimal example demonstrating the performance of my CartPole agent. As per the off…
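A self-contained sketch of the kind of minimal evaluation loop such an example rests on, written against the Gymnasium five-tuple step API; `make_env` and the constant policy here are placeholders standing in for `gym.make("CartPole-v1")` and the tuned agent:

```python
def evaluate(make_env, policy, n_episodes=5):
    """Mean episodic return of `policy` over `n_episodes` fresh episodes."""
    returns = []
    for _ in range(n_episodes):
        env = make_env()
        obs, info = env.reset()
        done, total = False, 0.0
        while not done:
            # Gymnasium step API: (obs, reward, terminated, truncated, info)
            obs, reward, terminated, truncated, info = env.step(policy(obs))
            total += reward
            done = terminated or truncated
        returns.append(total)
    return sum(returns) / len(returns)
```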
-
### Question
Hi, I am unable to find the xml or urdf files for the assets, for example cube, table etc. I see that the urdf files are located in a path which involves variables such as `NUCLEUS_ASS…
-
### Bug Description
When using two tiled cameras in the scene, the last camera overrides all observations, resulting in both cameras capturing the same image.
### Steps to reproduce
1. Run th…
-
#### Issue Description
I am running the Cartpole example but ran into a problem, and I have already run the gym_http_server.py.
October 10, 2019 11:13:36 AM org.apache.http.impl.execchain.RetryExec execute
INFO…
-
Yes, ideally we re-create some of the OpenAI Gym continuous control tasks,
starting with CartPole and InvertedPendulum-v1, keeping the interface very similar, on top of pybullet. We likely need to extend…
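A rough skeleton of what such a Gym-like interface on top of PyBullet could look like, using the old four-tuple `step()` API of that era; the class name and bodies are illustrative, and the actual PyBullet calls are only indicated in comments so the sketch stays self-contained:

```python
class CartPoleBulletEnv:
    """Illustrative skeleton of a Gym-style env wrapping PyBullet."""

    def reset(self):
        # real version would call p.resetSimulation() and load the URDF
        self.state = [0.0, 0.0, 0.0, 0.0]  # x, x_dot, theta, theta_dot
        return self.state

    def step(self, action):
        # real version would apply the force to the cart joint and then
        # advance the physics with p.stepSimulation()
        reward = 1.0
        done = abs(self.state[2]) > 0.2  # pole-angle limit, illustrative
        return self.state, reward, done, {}
```

Keeping `reset()`/`step()` signatures identical to Gym's means existing agents can be pointed at the pybullet versions unchanged.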