Closed. Eriz11 closed this issue 5 years ago.
The env_params input to the graph manager should be an EnvironmentParameters object. In this case, it should be a GymEnvironmentParameters object and not a GymEnvironment object.
If you would like to access the environment internals, you may want to try using CoachInterface
as demonstrated in the "Agent Functionality" subsection of the Getting Started Tutorial.
Does this answer your question?
Hey galnov,
Many thanks for taking the time to answer.
I finally resolved the problem this morning using GymVectorEnvironment, which inherits from GymEnvironmentParameters. So, yes: answered. However, I think the use of the different classes held in the gym_environment.py module should be made clearer in the docs (I was using GymEnvironment because it is well documented in the source code and seems to be the one that permits the most tweaking, only to find that it is not the right one to use directly). I'm open to making that happen on my side and adding my two cents.
Also, regarding accessing the environment internals, it is a bit convoluted to do so. I finally got it to work with this workaround: graphManager.environments[0].env. This way, you get access to the env object, and from there anything can be done. I don't know whether making this more accessible within the preset-like structure would make sense; in any case, I'm also open to giving it some thought if it helps.
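To make that attribute chain concrete, here is a toy mock of the structure involved (the class names and the save_summary method below are stand-ins invented for illustration, not Coach's actual classes): the graph manager keeps a list of environment wrappers, and each wrapper stores the raw gym-style env in its .env attribute.

```python
class ToyTradingEnv:
    """Stand-in for a custom gym-like env (save_summary is hypothetical)."""
    def save_summary(self):
        return "summary written"

class ToyEnvWrapper:
    """Stand-in for the environment wrapper, which keeps the raw env in .env."""
    def __init__(self, env):
        self.env = env

class ToyGraphManager:
    """Stand-in for a graph manager after its environments list is populated."""
    def __init__(self):
        self.environments = [ToyEnvWrapper(ToyTradingEnv())]

graph_manager = ToyGraphManager()
raw_env = graph_manager.environments[0].env   # the workaround from this thread
result = raw_env.save_summary()               # -> "summary written"
```

With the real library, the same environments[0].env chain reaches the user's custom env instance after the graph has been created.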
Best,
Will close this issue as it is resolved.
Hi all,
Firstly, the library looks awesome in terms of its structural design, and the learning curve is less steep than I expected. Congrats on all the work done.
NOTE: Since I will be using a custom env, I tried the installation on an Ubuntu 18.04 and Python 3.7 machine, and the CartPole experiments I ran worked perfectly. Just to add my two cents here.
I have a custom Gym-like environment, which loads the data from a .csv and each step is one row from that .csv. The custom Gym env is installed with the following path:
gym-tradeZM/gymtradeZM/envs/tradeEnvScv0.py:ZMTradeEnvv0
I'm using the GymEnvironment class to get ahead with this, but I hit the following issue when running the GraphManager: "AttributeError: 'GymEnvironment' object has no attribute 'path'". I understand the error in the sense that I indeed cannot find any path attribute in the GymEnvironment object.

The idea behind using the GymEnvironment class is that I can access my custom environment directly, so that I can save some summary data after the training. If I use the GymVectorEnvironment or the GymEnvironmentParameters classes, I cannot then access my custom env object (can I, in some other manner?). Is there something I'm missing related to loading a custom gym environment that reads its data internally from a .csv? The observation space is just (27,) and the action space is Discrete(3). Any help in further debugging this error is much appreciated.
Full preset code (only this comment survived the paste):
# Create the path for the env.
Full stack trace:
Traceback (most recent call last):
  File "/home/zmlaptop/Desktop/RLFrameworks/coachRLProject/runCoachModel.py", line 107, in <module>
    graphManager.improve()
  File "/home/zmlaptop/miniconda3/envs/coach_rl/lib/python3.7/site-packages/rl_coach/graph_managers/graph_manager.py", line 531, in improve
    self.verify_graph_was_created()
  File "/home/zmlaptop/miniconda3/envs/coach_rl/lib/python3.7/site-packages/rl_coach/graph_managers/graph_manager.py", line 658, in verify_graph_was_created
    self.create_graph()
  File "/home/zmlaptop/miniconda3/envs/coach_rl/lib/python3.7/site-packages/rl_coach/graph_managers/graph_manager.py", line 146, in create_graph
    self.level_managers, self.environments = self._create_graph(task_parameters)
  File "/home/zmlaptop/miniconda3/envs/coach_rl/lib/python3.7/site-packages/rl_coach/graph_managers/basic_rl_graph_manager.py", line 62, in _create_graph
    env = short_dynamic_import(self.env_params.path)(**self.env_params.dict,
AttributeError: 'GymEnvironment' object has no attribute 'path'
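For context on that final frame: the graph manager resolves env_params.path, a "module:ClassName" string that the parameters classes (such as GymVectorEnvironment) carry but the runtime GymEnvironment wrapper does not, hence the AttributeError when a GymEnvironment is passed as env_params. A simplified stand-in for that resolution step (not Coach's actual short_dynamic_import, which also handles file-system paths) looks roughly like:

```python
import importlib

def resolve_class(path):
    """Resolve a 'module.submodule:ClassName' string to the class object,
    mimicking the lookup Coach performs on env_params.path."""
    module_name, class_name = path.split(':')
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Works for any importable module:Class pair, e.g. a stdlib class:
cls = resolve_class('collections:OrderedDict')
```

Passing a parameters object that defines path (e.g. GymVectorEnvironment) lets this lookup succeed and instantiate the environment class.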