Closed: dorklebop closed this issue 6 years ago

I have a custom gym environment, and I would like to try the starter agent on this env. Is this possible, and if so, which files should I edit (I am guessing that it should mainly be `envs.py`)? Are there any requirements the custom env should satisfy?
universe-starter-agent only works with pixel (`Box(Y, X, 1)` or `Box(Y, X, 3)`) observations and `Discrete(N)` actions. If your env satisfies that, it should work. Just edit `envs.py` to create the environment and wrap it as needed.
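For concreteness, a minimal custom env satisfying those constraints might look like the sketch below. Everything in it is a placeholder (the class name, the 84x84 frame size, the 4 actions), and the underscore `_reset`/`_step` method names follow the old gym API that this repo was written against:

```python
import gym
import numpy as np
from gym import spaces

class MyPixelEnv(gym.Env):
    """Hypothetical env: Box(Y, X, 3) pixel observations, Discrete(N) actions."""

    def __init__(self):
        # 84x84 RGB frames and 4 actions are arbitrary placeholder choices
        self.observation_space = spaces.Box(low=0, high=255, shape=(84, 84, 3))
        self.action_space = spaces.Discrete(4)

    def _reset(self):
        # return the initial observation (here: a black frame)
        return np.zeros((84, 84, 3), dtype=np.uint8)

    def _step(self, action):
        # apply `action`, then return (observation, reward, done, info)
        obs = np.zeros((84, 84, 3), dtype=np.uint8)
        return obs, 0.0, False, {}
```

You would also need to register the env with gym (via `gym.envs.registration.register`) so that `gym.make` can find it by id.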
If it's a non-vectorized (classic gym) environment, you'll need to add the Vectorize filter, as is done for Atari envs here: https://github.com/openai/universe-starter-agent/blob/master/envs.py#L75
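Following that Atari example, a helper for your env in `envs.py` might look roughly like this sketch. The function name and env id are hypothetical; `Vectorize` and `Unvectorize` come from `universe.wrappers`, as in the linked file:

```python
import gym
from universe.wrappers import Vectorize, Unvectorize

def create_custom_env(env_id):
    env = gym.make(env_id)  # your registered, non-vectorized gym env
    env = Vectorize(env)    # adapt it to universe's vectorized env interface
    # ...any observation preprocessing wrappers (e.g. rescaling) go here...
    env = Unvectorize(env)  # the worker code expects a plain, unvectorized env
    return env
```

You would then dispatch to this helper from `create_env()` in `envs.py`, the same way the Atari branch is selected.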