med-air / SurRoL

[IROS'21] SurRoL: An Open-source Reinforcement Learning Centered and dVRK Compatible Platform for Surgical Robot Learning
https://med-air.github.io/SurRoL/
MIT License

How to test training of environment #1

Closed Cladett closed 2 years ago

Cladett commented 2 years ago

Hello,

Thank you for sharing your repo. I wanted to test it and replicate your results. I downloaded and installed it, ran the test scripts, and everything seems fine. I then checked inside /run, where there are three bash scripts.

I tried to run ddpg_activetrack.sh with the env id `--env=ActiveTrack-v0`.

I get the following error:

    Traceback (most recent call last):
      File "/home/claudia/.pyenv/versions/3.7.12/lib/python3.7/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/home/claudia/.pyenv/versions/3.7.12/lib/python3.7/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/home/claudia/data/virtual_environments/surrol/lib/python3.7/site-packages/baselines-0.1.6-py3.7.egg/baselines/run.py", line 250, in <module>
        main(sys.argv)
      File "/home/claudia/data/virtual_environments/surrol/lib/python3.7/site-packages/baselines-0.1.6-py3.7.egg/baselines/run.py", line 216, in main
        model, env = train(args, extra_args)
      File "/home/claudia/data/virtual_environments/surrol/lib/python3.7/site-packages/baselines-0.1.6-py3.7.egg/baselines/run.py", line 54, in train
        env_type, env_id = get_env_type(args)
      File "/home/claudia/data/virtual_environments/surrol/lib/python3.7/site-packages/baselines-0.1.6-py3.7.egg/baselines/run.py", line 143, in get_env_type
        assert env_type is not None, 'env_id {} is not recognized in env types'.format(env_id, _game_envs.keys())
    AssertionError: env_id ActiveTrack-v0 is not recognized in env types

Is there a problem with the environment registration in gym?

Following Issue #662 from Baselines, I added the following line in baselines/run.py at line 52:

    _game_envs['custom_type'] = {'ActiveTrack-v0'}

But again I get the following error:

    gym.error.UnregisteredEnv: No registered env with id: ActiveTrack-v0

Thank you, Cheers Claudia

jiaqixuac commented 2 years ago

Hi @Cladett, thanks for the comments! If you want to use Baselines to run the experiments, the first step is to register the custom environments in OpenAI Gym. The simplest way is to add one line of code in gym/gym/envs/__init__.py:

    import surrol.gym

Please refer to register a custom environment and gym envs. I will update the README accordingly.

Thank you!

Cladett commented 2 years ago

Hi @jiaqixuac, thanks for the answer. I thought I was missing something with the registration; thank you for clarifying.

Based on what you suggested, I did a bit of reading. At least for my setup, I prefer not to modify the gym repo code, so I made a couple of changes to your repo instead, which let me correctly register all the environments.

So basically I just moved the __init__.py that you had inside surrol/gym up into surrol, and slightly modified the registration of each env as follows:

    register(
        id='NeedleReach-v0',
        # entry_point='surrol.tasks.needle_reach:NeedleReach',  (old line)
        entry_point='surrol.tasks:NeedleReach',  # (new line)
        max_episode_steps=50,
    )

Then inside surrol/tasks I added a new __init__.py with just:

    from surrol.tasks.needle_reach import NeedleReach
    from surrol.tasks.needle_pick import NeedlePick
    from surrol.tasks.gauze_retrieve import GauzeRetrieve
    from surrol.tasks.peg_transfer import PegTransfer
    from surrol.tasks.needle_regrasp_bimanual import NeedleRegrasp
    from surrol.tasks.peg_transfer_bimanual import BiPegTransfer
    from surrol.tasks.ecm_reach import ECMReach
    from surrol.tasks.ecm_misorient import MisOrient
    from surrol.tasks.ecm_static_track import StaticTrack
    from surrol.tasks.ecm_active_track import ActiveTrack

This way everything is structured as gym suggests for creating a custom env, and nothing outside your repo needs to be modified.
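The shorter entry point works only because of those re-exports. Here is a simplified sketch of how a Gym-style `"module:attr"` string is resolved (not Gym's exact code), using a stdlib re-export as a stand-in for the `surrol.tasks` case:

```python
import importlib

def load_entry_point(spec):
    # Resolve a Gym-style "module:attr" entry point (simplified sketch,
    # not Gym's exact implementation).
    module_name, attr = spec.split(":")
    return getattr(importlib.import_module(module_name), attr)

# Direct module path, analogous to the old
# 'surrol.tasks.needle_reach:NeedleReach':
direct = load_entry_point("json.decoder:JSONDecodeError")

# Via a package re-export, analogous to the new 'surrol.tasks:NeedleReach'
# (json/__init__.py re-exports JSONDecodeError from json.decoder, just as
# the new surrol/tasks/__init__.py re-exports the task classes):
reexported = load_entry_point("json:JSONDecodeError")
```

Both specs resolve to the same object; without the re-export in the package's __init__.py, the shorter form would fail with an AttributeError.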

I also added a couple more dependencies to setup.py, which I needed while testing this in a fresh virtualenv:

    from setuptools import setup

    if __name__ == '__main__':
        setup(
            name='surrol',
            version='0.1.0',
            description='SurRoL: An Open-source Reinforcement Learning Centered and '
                        'dVRK Compatible Platform for Surgical Robot Learning',
            author='Med-AIR@CUHK',
            keywords='simulation, medical robotics, dVRK, reinforcement learning',
            packages=[
                'surrol',
                'surrol.tasks',
            ],
            install_requires=[
                "gym>=0.15.6",
                "pybullet>=3.0.7",
                "numpy>=1.21.1",
                "scipy",
                "pandas",
                "imageio",
                "imageio-ffmpeg",
                "opencv-python",
                "roboticstoolbox-python",
                "sympy",
                "cvxopt",
            ],
            extras_require={
                # optional dependencies, required by evaluation, test, etc.
                "all": [
                    "tensorflow-gpu==1.14",
                    "baselines",
                    "mpi4py",  # important for ddpg
                    "ipython",
                    "jupyter",
                ]
            }
        )
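With `extras_require` laid out like this, the optional dependencies can be pulled in through pip's standard extras syntax, e.g.:

```shell
# Core install (editable), pulling only install_requires:
pip install -e .

# Include the optional evaluation/test dependencies declared under "all":
pip install -e ".[all]"
```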