-
It looks like the gym_futures package isn't available via pip. How would I go about running this? Does it have to be compiled alongside OpenAI Gym, with futures_env.py manually added as a custom …
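If the package can't be installed from pip, the usual route is to drop the env file into your project and register the class yourself. The sketch below shows the classic Gym env interface such a file would implement; the class name, observation values, and registration id are illustrative stand-ins, not the real gym_futures code.

```python
# Minimal sketch of the classic Gym env interface a file like
# futures_env.py would implement. Everything here is a placeholder
# stand-in, not the actual gym_futures implementation.

class FuturesEnvSketch:
    """Duck-typed env exposing the classic 4-tuple step API."""

    def __init__(self, horizon=10):
        self.horizon = horizon
        self.t = 0

    def reset(self):
        self.t = 0
        return 0.0  # initial observation (placeholder value)

    def step(self, action):
        self.t += 1
        obs, reward = float(self.t), 0.0
        done = self.t >= self.horizon
        return obs, reward, done, {}  # classic gym 4-tuple

# With the real class on your PYTHONPATH, the usual mechanism is to
# register it so gym.make() can find it, e.g. (id string is hypothetical):
#   from gym.envs.registration import register
#   register(id="FuturesEnv-v0", entry_point="futures_env:FuturesEnv")
```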
-
Hello dear Ray Team,
I hope you are doing well. I have been using the Ray API (more specifically RLlib & Tune) for conducting Reinforcement Learning experiments with the various RL algorithms RLli…
-
Hi, would it be possible for gym-softrobot to be upgraded from gym to gymnasium? Gymnasium is the maintained version of OpenAI Gym and is compatible with current RL training libraries ([rllib](https:/…
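The main API break such an upgrade has to bridge is the step/reset signature change: gymnasium's `reset()` returns `(obs, info)` and `step()` returns a 5-tuple that splits `done` into `terminated` and `truncated`. Here is a small pure-Python sketch of the conversion; it is an illustration of the mapping, not code from either library.

```python
# Shape difference an upgrade has to bridge:
#   classic gym:  obs, reward, done, info                    (4-tuple)
#   gymnasium:    obs, reward, terminated, truncated, info   (5-tuple)
# This helper normalizes either shape to the gymnasium 5-tuple.

def to_five_tuple(step_result):
    if len(step_result) == 5:
        return step_result  # already gymnasium-style
    obs, reward, done, info = step_result
    # Classic gym folded time-limit truncation into `done`; the TimeLimit
    # wrapper recorded it in info["TimeLimit.truncated"].
    truncated = bool(info.get("TimeLimit.truncated", False))
    terminated = done and not truncated
    return obs, reward, terminated, truncated, info
```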
-
### 🚀 Feature
It would be nice to have a wrapper that ingests a gymnasium.vector.VectorEnv and gives back a VecEnv.
### Motivation
I want to do highly parallelized, hardware-accelerated simulation. Th…
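The core of such a wrapper is translating gymnasium's `(terminated, truncated)` pairs and `(obs, info)` reset into the VecEnv surface, which returns plain `obs` from reset and a single `done` flag per env. The sketch below uses a toy stand-in for the gymnasium side so it stays self-contained; a real wrapper would delegate to an actual `gymnasium.vector.VectorEnv` and handle observation stacking.

```python
# Sketch of the translation a VectorEnv -> VecEnv wrapper has to do.
# ToyVectorEnv is a stand-in for the gymnasium side, not a real env.

class ToyVectorEnv:
    """Stand-in vector env: sub-env i terminates after i+1 steps."""
    def __init__(self, num_envs):
        self.num_envs = num_envs
        self.steps = [0] * num_envs

    def reset(self):
        self.steps = [0] * self.num_envs
        return [0.0] * self.num_envs, {}    # gymnasium: (obs, info)

    def step(self, actions):
        obs, rewards, terminated, truncated = [], [], [], []
        for i in range(self.num_envs):
            self.steps[i] += 1
            done = self.steps[i] > i
            if done:
                self.steps[i] = 0           # gymnasium vector envs autoreset
            obs.append(float(self.steps[i]))
            rewards.append(1.0)
            terminated.append(done)
            truncated.append(False)
        return obs, rewards, terminated, truncated, {}

class VecEnvAdapter:
    """Present the SB3-style VecEnv surface over the gymnasium-style env."""
    def __init__(self, venv):
        self.venv = venv
        self.num_envs = venv.num_envs

    def reset(self):
        obs, _info = self.venv.reset()
        return obs                          # VecEnv: reset returns obs only

    def step(self, actions):
        obs, rewards, term, trunc, info = self.venv.step(actions)
        dones = [t or tr for t, tr in zip(term, trunc)]
        return obs, rewards, dones, info    # VecEnv: 4-tuple with done
```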
-
# Driving Up A Mountain - A Random Walk
A while back, I found OpenAI’s Gym environments and immediately wanted to try to solve one of them. I didn’t really know what I was doing at the …
-
Hi, would it be possible for robo-gym to be upgraded from gym to gymnasium? Gymnasium is the maintained version of OpenAI Gym and is compatible with current RL training libraries ([rllib](https://gith…
-
It would be interesting to be able to run Atari against the [OpenAI gym](https://github.com/openai/gym) environment. OpenAI Gym is currently in Python; maybe it would be possible to have a [pytorch](https://gi…
-
Use Dict or Tuple; the code is [here](https://github.com/openai/gym/blob/master/gym/spaces/dict_space.py).
_Originally posted by @yangpeiren in https://github.com/openai/gym/issues/59…
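The idea behind `gym.spaces.Dict` (and `Tuple`) is a composite space that delegates sampling and membership checks to named sub-spaces. The pure-Python sketch below mirrors that pattern with a minimal `Discrete` stand-in; it illustrates the composite-space idea and is not the gym source itself.

```python
import random

# Pure-Python sketch of the composite-space pattern behind gym.spaces.Dict:
# sample() and contains() recurse into named sub-spaces. Discrete here
# mirrors gym.spaces.Discrete(n) = {0, ..., n-1}.

class Discrete:
    def __init__(self, n):
        self.n = n
    def sample(self):
        return random.randrange(self.n)
    def contains(self, x):
        return isinstance(x, int) and 0 <= x < self.n

class DictSpace:
    def __init__(self, spaces):
        self.spaces = spaces              # name -> sub-space
    def sample(self):
        return {k: s.sample() for k, s in self.spaces.items()}
    def contains(self, x):
        return (set(x) == set(self.spaces)
                and all(s.contains(x[k]) for k, s in self.spaces.items()))

# Composite observation: every sample is a dict with one entry per sub-space.
space = DictSpace({"position": Discrete(5), "gear": Discrete(3)})
obs = space.sample()
```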
-
I'm trying to port an OpenAI Gym environment and use Coach to do the learning on top.
The tutorial currently reads (emphasis mine):
```
Adding an Environment
Adding your custom environments to…
-
Gym API supports MultiDiscrete action spaces:
https://github.com/openai/gym/blob/master/gym/spaces/multi_discrete.py
This is useful when you want to discretize a continuous control problem, a tec…
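Discretizing a continuous control problem with MultiDiscrete amounts to giving each action dimension its own number of bins, so a tuple of indices maps back to continuous values. The decoder below sketches that mapping in pure Python; the bin counts and ranges are made-up examples, not anything from gym.

```python
# Sketch of decoding a MultiDiscrete-style action into continuous control:
# each dimension gets its own bins, and each index selects a bin center.
# Ranges and bin counts below are illustrative assumptions.

def make_decoder(lows, highs, nbins):
    """Return a function mapping discrete indices -> continuous actions."""
    def decode(indices):
        out = []
        for idx, lo, hi, n in zip(indices, lows, highs, nbins):
            width = (hi - lo) / n
            out.append(lo + (idx + 0.5) * width)  # bin center
        return out
    return decode

# Two control dimensions, e.g. steering in [-1, 1] with 4 bins and
# throttle in [0, 1] with 2 bins -- a MultiDiscrete([4, 2]) action space.
decode = make_decoder(lows=[-1.0, 0.0], highs=[1.0, 1.0], nbins=[4, 2])
action = decode([0, 1])  # [-0.75, 0.75]
```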