ucsdarclab / dVRL

Contains Reinforcement Learning environments for the da Vinci Surgical System

register env to gym #6

Closed ljjTYJR closed 4 years ago

ljjTYJR commented 4 years ago

Hello, I have changed PsmEnv.py to run on my local computer; when I run gym.make() directly, it succeeds.

However, when I want to use OpenAI/baselines to train the model, I get this error:

raise error.UnregisteredEnv('No registered env with id: {}'.format(id))
gym.error.UnregisteredEnv: No registered env with id: dVRLReach-v0

Since I set up the project in PyCharm, when I run the following in PyCharm:

from gym import envs
print(envs.registry.all())

It shows:

......, EnvSpec(dVRLReach-v0), EnvSpec(dVRLPick-v0)]); these two envs are included.

But when I run the same code from the command line:

(RLBench) PS> python
Python 3.6.10 |Anaconda, Inc.| (default, Mar 23 2020, 17:58:33) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from gym import envs
>>> print(envs.registry.all())

It shows that the two envs are not included in gym envs;

Then, when I run __init__.py, I get this message:

File "c:\users\administrator\gym\gym\envs\registration.py", line 132, in register raise error.Error('Cannot re-register id: {}'.format(id)) gym.error.Error: Cannot re-register id: dVRLReach-v0

But when I add this check to __init__.py:

if 'dVRLReach-v0' in gym.envs.registry.all():
    print('add in the dict')

it does not print 'add in the dict'.

Could anyone give me some hints?
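(For reference, one possible reason the check above never prints is that gym.envs.registry.all() yields EnvSpec objects rather than id strings, so an id lookup against it never matches. A check keyed on the ids, as a sketch that assumes an older gym exposing the env_specs dict, would look like:)

import gym

# Sketch only: env_specs is the id-keyed dict used by older gym versions (assumption);
# registry.all() returns EnvSpec objects, so "'dVRLReach-v0' in registry.all()" never matches.
if 'dVRLReach-v0' in gym.envs.registry.env_specs:
    print('add in the dict')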

bango123 commented 4 years ago

The only thing I can think of is to make sure you add import dVRL_simulator..., as that is what runs the __init__.py files.

ljjTYJR commented 4 years ago

If I want to register my env with gym, do I need to add the register code to gym's own __init__.py?

ljjTYJR commented 4 years ago

I solved this problem by using python to run the '*.py' file directly and passing the arguments on the command line.

bango123 commented 4 years ago

Just to add on: yes, you need to make sure the register command runs somehow. I put the calls in the __init__.py file so the environments are registered when the package is imported.
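(For illustration, the registration bango123 describes in the package's __init__.py might look like the sketch below; the ids match this repo, but the entry_point strings are assumptions rather than the repo's actual values:)

# dVRL_simulator/__init__.py (sketch)
from gym.envs.registration import register

# Runs once when `import dVRL_simulator` executes, so gym.make() can find these ids.
register(id='dVRLReach-v0',
         entry_point='dVRL_simulator.environments.reach:PSMReachEnv')
register(id='dVRLPick-v0',
         entry_point='dVRL_simulator.environments.pick:PSMPickEnv')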

nndei commented 4 years ago

Dear ljjTYJR and bango123,

I am a MacBook Pro user who cannot run these environments in Docker containers (if I have understood correctly), because bango123 uses the --runtime=nvidia option for that and I have an integrated Intel GPU and a dedicated AMD GPU. From what I have understood, that option needs the CUDA drivers and the NVIDIA driver installed, and there is no compatibility with Intel or AMD. Please correct me if I'm wrong.

Therefore, I have understood that I should avoid using the Docker containers, and so I will not be able to run multiple containers for training. Am I right so far?

I am having difficulties running everything with these incompatibilities of mine (e.g. having a MacBook with no NVIDIA GPU).

Having a new MacBook means that I can run Ubuntu, but with many inconveniences, like not having working Wi-Fi, which is why I am asking what I can do both in macOS and in Ubuntu (if macOS is really unusable).

Therefore, I would like to ask you:

Thank you! Best, Neri

ljjTYJR commented 4 years ago

Hello nndei, in the file vrep.py there is the following code:

libsimx = None
try:
    file_extension = '.so'
    if platform.system() == 'cli':
        file_extension = '.dll'
    elif platform.system() == 'Windows':
        file_extension = '.dll'
    elif platform.system() == 'Darwin':
        file_extension = '.dylib'
    else:
        file_extension = '.so'
    libfullpath = os.path.join(os.path.dirname(__file__), 'remoteApi' + file_extension)
    libsimx = ct.CDLL(libfullpath)
except:
    print('----------------------------------------------------')
    print('The remoteApi library could not be loaded. Make sure')
    print('it is located in the same folder as "vrep.py", or')
    print('appropriately adjust the file "vrep.py"')
    print('----------------------------------------------------')
    print('')

So I think that if you want to run it on a MacBook, you need the .dylib library instead of the .so one. If you download the macOS version of V-REP, the file should be included in its directory. (And be careful: V-REP has changed name; the company is Coppelia Robotics and the simulator is now CoppeliaSim.)

I didn't run it in Docker simply because I don't know how to get the images out of Docker onto my Windows machine.

nndei commented 4 years ago

Thank you! I read that code in the vrep.py file. So I will try that extension!

For clarity purpose, let me enumerate my doubts:

  1. So, I need to make sure that the downloaded vrep folder is in the same directory as the remoteApi.dylib file? I downloaded V-REP 3.6.2 Pro Edu, not CoppeliaSim.

  2. About the last thing you said: so Docker is not useful for us, right? Also, the GUI shouldn't be important, since it is only needed for Docker to persist data on the host?

  3. To change the environment, I need to edit the .ttt file, right? Do I need to run any of those .py files, or are they called by the .ttt file, so that I just need to open the .ttt file?

Best regards, Neri

ljjTYJR commented 4 years ago
  1. remoteApi.dylib should be included in the V-REP directory when you download it; you need to copy remoteApi.dylib into the dVRL directory (dVRL/dVRL_simulator/vrep).
  2. If you want to train multiple instances at the same time, you need to use Docker. And yes, I think that if you just want to get the data and do not need the GUI, the GUI is not important.
  3. No, V-REP is cross-platform, you do not need to edit that file. The .ttt file saves the structure of the robot; you can consider it the robot simulator scene, and the .py files in this repo just define some interfaces to the robot. For this, I recommend you read some basics of gym: https://gym.openai.com/ (a minimal interaction loop is sketched below).
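(For reference, the basic gym interaction loop the docs describe looks roughly like this sketch, using the Reach env from this repo:)

import gym
import dVRL_simulator  # importing the package registers the dVRL envs

env = gym.make('dVRLReach-v0')
obs = env.reset()
for _ in range(10):
    action = env.action_space.sample()         # random action, just to exercise the interface
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()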
nndei commented 4 years ago

Thank you, again.

So, for number 1, I have found the file and will copy it there. There is also a whole remoteApi folder, but I guess it is not important.

For 2, OK, I get it. Do you know if the CUDA situation is compatible only with NVIDIA GPUs? As I said, I have an Intel and an AMD. It's unlucky not to have access to multi-instance training.

For 3, OK! So I will leave it as is. I was wondering because I was told to set everything up and then modify the environment, which is why I was asking which file that would be.

I'll check gym :)

nndei commented 4 years ago

I am left with one last doubt, which I'd like to call 4.

After modifying PsmEnv.py by commenting out the lines and adding the local_ip variable, how do I check that everything is working? I don't get bango123's reply in issue #3:

A new instance of the PSM environments should now link with your opened *.ttt file!

Best regards, Neri

ljjTYJR commented 4 years ago

Open the file dVRK-oneArm-reach.ttt with V-REP, then run the .py file above; if it connects to V-REP and there is no error, it's OK.
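(A minimal check, as a sketch: with V-REP open and the scene loaded, something like the following should connect without raising the "V-Rep failed to load!" error:)

import gym
import dVRL_simulator  # registers dVRLReach-v0

# Assumes V-REP is already running with dVRK-oneArm-reach.ttt loaded.
try:
    env = gym.make('dVRLReach-v0')
    print('Connected to V-REP, environment created')
    env.close()
except IOError as e:
    print('Connection failed:', e)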

nndei commented 4 years ago

Thank you for the reply. I have trouble with dVRL_simulator: once I run the python file, I get:

ModuleNotFoundError: No module named 'dVRL_simulator'

ljjTYJR commented 4 years ago

Did you create the .py file under the dVRL directory?

nndei commented 4 years ago

I was in the environments folder, inside the dVRL folder. Now that I have moved it from there to dVRL itself, it runs.

However, I get these errors in the output:

Traceback (most recent call last):
  File "prova.py", line 3, in <module>
    env = gym.make('dVRLReach-v0')
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 142, in make
    return registry.make(id, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 87, in make
    env = spec.make(**kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 59, in make
    env = cls(**_kwargs)
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/environments/reach.py", line 9, in __init__
    super(PSMReachEnv, self).__init__(psm_num = psm_num, n_substeps = 1, block_gripper = True,
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/PsmEnv_Position.py", line 76, in __init__
    super(PSMEnv_Position, self).__init__(psm_num = psm_num, n_substeps=n_substeps, n_states = self.n_states,
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/PsmEnv.py", line 87, in __init__
    raise IOError('V-Rep failed to load!')
OSError: V-Rep failed to load!

Note: prova.py is the file you told me to create above; I named it prova ("try" in Italian). Did I miss commenting something out? I commented out lines 65-69 and 177 in PsmEnv.py and also added the variable self.container_ip = "127.0.0.1".

Thank you, Neri

ljjTYJR commented 4 years ago

From the info above, it seems it does not connect to V-REP, so I think you need to open V-REP first and re-run the file (a minimal connection check is sketched after the steps below).

Also, here are the other steps I made:

  1. Comment out lines 48-68 in PsmEnv.py; comment out line 177.
  2. In PsmEnv_Position.py, I commented out the docker_container variable in def __init__.
  3. I did the same thing (step 2) in pick.py and reach.py.
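(If the IOError persists, it can help to check the raw remote-API connection outside of gym. A sketch, using the vrep.py bindings bundled with the repo; the import path and the default remote-API port 19997 are assumptions:)

from dVRL_simulator.vrep import vrep  # bundled legacy V-REP remote-API bindings (path assumed)

# Try to reach a locally running V-REP instance on the default remote-API port.
client_id = vrep.simxStart('127.0.0.1', 19997, True, True, 5000, 5)
if client_id == -1:
    print('Could not connect: is V-REP open with the scene loaded?')
else:
    print('Connected, client id =', client_id)
    vrep.simxFinish(client_id)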
nndei commented 4 years ago

Thank you again for the support.

I was already running V-REP and had opened the dVRK-oneArm-reach.ttt file as previously indicated. I also tried putting the downloaded V-REP folder in the dVRL folder. I commented out everything as you said, making sure to correctly close the parentheses.

Here's the output:

Traceback (most recent call last):
  File "prova.py", line 3, in <module>
    env = gym.make('dVRLReach-v0')
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 142, in make
    return registry.make(id, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 87, in make
    env = spec.make(**kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/envs/registration.py", line 59, in make
    env = cls(**_kwargs)
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/environments/reach.py", line 9, in __init__
    super(PSMReachEnv, self).__init__(psm_num = psm_num, n_substeps = 1, block_gripper = True,
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/PsmEnv_Position.py", line 77, in __init__
    super(PSMEnv_Position, self).__init__(psm_num = psm_num, n_substeps=n_substeps, n_states = self.n_states, 
  File "/Users/neri/Desktop/Tesi_Magistrale/dVRL/dVRL_simulator/PsmEnv.py", line 87, in __init__
    raise IOError('V-Rep failed to load!')
OSError: V-Rep failed to load!
bango123 commented 4 years ago

So this IO error comes up because the Python code did not link with the V-REP scene. I just tested a little bit, and I believe the V-REP scene needs to be open and running to connect. So make sure you hit the "play" button in the V-REP GUI before running the code. Let me know if that solves your issue.
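(If pressing play by hand gets tedious, the remote API can also start and stop the simulation from code. A rough sketch, again assuming the bundled bindings and the default port:)

from dVRL_simulator.vrep import vrep  # bundled bindings, path assumed

client_id = vrep.simxStart('127.0.0.1', 19997, True, True, 5000, 5)
if client_id != -1:
    # Equivalent of pressing the play/stop buttons in the V-REP GUI.
    vrep.simxStartSimulation(client_id, vrep.simx_opmode_oneshot_wait)
    # ... run the gym code here ...
    vrep.simxStopSimulation(client_id, vrep.simx_opmode_oneshot_wait)
    vrep.simxFinish(client_id)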

BTW, @ljjTYJR, thank you for all your help. If you have come up with simple steps/modifications to run the environments on other systems, I would appreciate a pull request :) No pressure of course!

nndei commented 4 years ago

Thank you for helping too, bango123. I took your advice and now the code outputs:

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/gym/logger.py:30: UserWarning: WARN: Box bound precision lowered by casting to float32
  warnings.warn(colorize('%s: %s'%('WARN', msg % args), 'yellow'))

I'm not sure what it means, but I believe it's minor. What do you think?

I would be happy to help enrich the guide for other OSs and hardware setups too, but I don't know if I have enough knowledge to cover every detail without saying something wrong.

ljjTYJR commented 4 years ago

Cool, @bango123, but do I need to upload the new files or just update the README?

bango123 commented 4 years ago

I think updating the README would be sufficient!