jr-robotics / robo-gym

An open source toolkit for Distributed Deep Reinforcement Learning on real and simulated robots.
https://sites.google.com/view/robo-gym
MIT License

speed up the training or launch the same env simultaneously #24

Closed ChenyangRan closed 3 years ago

ChenyangRan commented 3 years ago

Hi, I'm using 'EndEffectorPositioningUR10Sim-v0' from robo_gym v1.8.0 with ros-kinetic. It works well, but training an epoch is too time-consuming. Is there a way to accelerate the environment or to launch the same env simultaneously? I have tried launching the ur_env simultaneously by running 'start-server-manager' twice, and a port is opened. Although the algorithm is still in progress, I don't know if the two environments are working in parallel. (screenshot attached)

Ubuntu 16.04, ROS Kinetic, Gazebo 9.16

matteolucchi commented 3 years ago

Hi @ChenyangRan and thank you for using robo-gym!

You can start the Server Manager once and then call env.make() multiple times. With the algorithm that we are using right now we have multiple workers running in parallel; each worker calls env.make(), and the Server Manager spawns a new instance of the env for it. On a powerful enough machine we run up to 20 workers/environments in parallel.

Right now we are only able to run simulations up to real time, but we plan to look into solutions to speed up training.
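A minimal sketch of that worker pattern, assuming the Server Manager is already running (started once via `start-server-manager`) and that each robo-gym env is created with a `gym.make(...)` call as in the README; the `DummyEnv` stand-in, the `rollout_worker`/`run_parallel` names, and the `ip` value are illustrative assumptions, not part of robo-gym:

```python
import multiprocessing as mp
import random

def rollout_worker(worker_id, episodes, env_factory, queue):
    """One env per worker: the Server Manager spawns a fresh simulation
    instance for every env created, so workers run independently."""
    env = env_factory()
    total_reward = 0.0
    for _ in range(episodes):
        env.reset()
        done = False
        while not done:
            _, reward, done, _ = env.step(env.action_space.sample())
            total_reward += reward
    queue.put((worker_id, total_reward))

# With robo-gym installed and the Server Manager running, the factory
# would be something like (assumed call pattern, check your version's README):
#
#   def make_env():
#       import gym, robo_gym
#       return gym.make('EndEffectorPositioningUR10Sim-v0', ip='127.0.0.1')
#
# A tiny stand-in with the same Gym interface, so the pattern is runnable:
class _DummySpace:
    def sample(self):
        return random.random()

class DummyEnv:
    action_space = _DummySpace()
    def __init__(self):
        self._t = 0
    def reset(self):
        self._t = 0
        return 0.0
    def step(self, action):
        self._t += 1
        # fixed reward of 1.0 per step, episodes end after 5 steps
        return 0.0, 1.0, self._t >= 5, {}

def run_parallel(n_workers=4, episodes=2, env_factory=DummyEnv):
    queue = mp.Queue()
    procs = [mp.Process(target=rollout_worker,
                        args=(i, episodes, env_factory, queue))
             for i in range(n_workers)]
    for p in procs:
        p.start()
    results = [queue.get() for _ in procs]  # drain before join
    for p in procs:
        p.join()
    return dict(results)

if __name__ == '__main__':
    print(run_parallel())  # e.g. {0: 10.0, 1: 10.0, 2: 10.0, 3: 10.0}
```

Swapping `DummyEnv` for the real factory gives one simulation per worker process, which is the parallelism described above.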

I am not sure if this answers your question, please let me know if you need further input :)

Matteo

ChenyangRan commented 3 years ago

Thanks, maybe I can launch environments with different parameters simultaneously to speed things up :)

matteolucchi commented 3 years ago

Yes, that could be an option. I will close this one for now, feel free to reopen if you need further info.