-
(gridenv) root@office2-B85M-D2V-SI:/home/office2/Jd_desk/RL_shortest_path/gym-novel-gridworlds/tests# python train.py
/home/office2/Jd_desk/RL_shortest_path/gym_grid_1/gridenv/lib/python3.7/site-pack…
-
In external-only mode with visual observation input, a texture image added to a material disappears (it renders black at runtime) in the image processed by the CNN.
Steps to reproduce the behavior:
…
-
Hi!
I'm working on recreating some basic RL policies using the common `Environment` interface, and it looks like `RLBase.reward` is not implemented for `EmptyRoom`.
I can open a pul…
-
### Subject of the issue
I'm trying to upgrade from MDX v1 to v2. Inline equations such as `$x < 3$` used to be correctly understood as a LaTeX equation in my old setup but stopped working with v…
-
**Describe the bug**
When wrapping a Unity ML-Agents environment in a Gym environment, the `gym_unity.envs.UnityEnv` class creates an incorrect Box observation space for visual observations w…
-
I have an environment similar to gridworld: a 4x4 grid where the agent is the yellow square 🟨, the green cells are the ones the agent has visited, and the red ones are the cells th…
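A setup like this can be sketched as a minimal grid environment in plain Python. Note that the class name `GridEnv`, the action set, and the reward values below are illustrative assumptions, not the poster's actual environment:

```python
# Minimal 4x4 grid environment sketch (illustrative only, not the actual code).
# Cell values: 0 = unvisited, 1 = visited (green). The agent position is
# tracked separately as a (row, col) tuple.

class GridEnv:
    SIZE = 4
    ACTIONS = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

    def __init__(self):
        self.reset()

    def reset(self):
        self.grid = [[0] * self.SIZE for _ in range(self.SIZE)]
        self.agent = (0, 0)   # start in the top-left corner
        self.grid[0][0] = 1   # the starting cell counts as visited
        return self.agent

    def step(self, action):
        dr, dc = self.ACTIONS[action]
        r, c = self.agent
        nr, nc = r + dr, c + dc
        # Moves off the grid leave the agent in place with a small penalty.
        if not (0 <= nr < self.SIZE and 0 <= nc < self.SIZE):
            return self.agent, -1.0, False
        # Reward visiting a new cell; mildly penalize revisits.
        reward = 1.0 if self.grid[nr][nc] == 0 else -0.1
        self.grid[nr][nc] = 1
        self.agent = (nr, nc)
        done = all(cell == 1 for row in self.grid for cell in row)
        return self.agent, reward, done
```

Here `step` returns the new agent position, a reward, and a done flag that fires once every cell has been visited.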
-
I think it would be good to have a processing benchmark for mlagents. We could then try to improve it through driver/cuda/unity flags and code optimization.
An initial benchmark could be the GridWo…
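As a starting point, the measurement itself could be a crude steps-per-second loop like the following. A gym-style step callable is assumed here; nothing in this sketch is ML-Agents-specific:

```python
import time

def steps_per_second(step_fn, n_steps=1000):
    """Time n_steps calls of a step function and return the throughput.

    step_fn stands in for env.step(action) on whatever environment is
    being benchmarked; this only sketches the measurement, not the env.
    """
    start = time.perf_counter()
    for _ in range(n_steps):
        step_fn()
    elapsed = time.perf_counter() - start
    return n_steps / elapsed
```

Comparing the returned rate before and after driver/CUDA/Unity flag changes would give the benchmark number to optimize.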
-
Many game developers use Unity for 2D games and would like to add ML-Agents to their projects; however, all the example environments are in 3D.
Moreover, there are some cases where one more dimension do…
-
Hi,
For testing purposes, I'd like to slow down the number of actions per second in inference mode with GridWorld. I have set a decision frequency of 1 and my timescale is 1 in inference. T…
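For context, the real-time decision rate can be estimated as follows, assuming decisions are requested once every `decision_period` FixedUpdate steps and Unity's default fixed timestep of 0.02 s (both assumptions, since the exact setup is cut off above):

```python
# Rough estimate of agent decisions per real second in a Unity-style setup.
# Assumes decisions happen every `decision_period` FixedUpdate steps and
# Unity's default fixed timestep of 0.02 s; illustrative only.

def decisions_per_second(decision_period, time_scale, fixed_delta_time=0.02):
    # Simulated FixedUpdate steps that elapse per real-world second.
    sim_steps_per_real_second = time_scale / fixed_delta_time
    return sim_steps_per_real_second / decision_period
```

Under these assumptions, a decision period of 1 at timescale 1 gives about 50 decisions per real second, so slowing the agent down would mean raising the decision period rather than lowering the timescale further.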
-
Hi all,
I'm using GridWorld as an example to make my own environment. I've got a question about the agent's visual observation input. The agent, as a GameObject, receives a Render Texture component (…