ray-project / ray

Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
https://ray.io
Apache License 2.0

ValueError: Registry value for rllib_model/carla doesn't exist. #4882

Closed Deepak3994 closed 3 years ago

Deepak3994 commented 5 years ago

Hi,

I am trying to run the rollout script on a trained Carla model for a scenario, but I am getting the error "ValueError: Registry value for rllib_model/carla doesn't exist". May I know what the problem is, why this error occurs, and how to resolve it?

Kindly help.

ericl commented 5 years ago

Did you register them in the script? https://github.com/ray-project/ray/blob/master/python/ray/rllib/rollout.py#L33

Deepak3994 commented 5 years ago

Yes @ericl

I have registered my environment with the id Carla-v0. When I check the registered Gym environments with "from gym import envs; print(envs.registry.all())", it lists the environment id I registered.

ericl commented 5 years ago

Well, first, the environment registration process is different in RLlib, as documented here: https://ray.readthedocs.io/en/latest/rllib-env.html

But regardless of that, I meant registering the model, not the env. See the comment in the script I linked above.

Deepak3994 commented 5 years ago

@ericl

Yes, I have done both the model registration, "ModelCatalog.register_custom_model("carla", CarlaModel)", and the environment registration, "register_env("CarlaEnv", lambda : CarlaEnv)".

stale[bot] commented 3 years ago

Hi, I'm a bot from the Ray team :)

To help human contributors focus on more relevant issues, I will automatically add the stale label to issues that have had no activity for more than 4 months.

If there is no further activity in the next 14 days, the issue will be closed!

You can always ask for help on our discussion forum or Ray's public slack channel.

stale[bot] commented 3 years ago

Hi again! The issue will be closed because there has been no activity in the 14 days since the last message.

Please feel free to reopen or open a new issue if you'd still like it to be addressed.

Again, you can always ask for help on our discussion forum or Ray's public slack channel.

Thanks again for opening the issue!

wangshuo1994 commented 3 years ago

@Deepak3994 Hi, have you solved this problem? I am facing the same issue when trying to reproduce my RL training results from a checkpoint... Thank you.

Forbu commented 2 years ago

@wangshuo1994 Hi, I also have the same issue.

Forbu commented 2 years ago

Ok, I just solved it... The issue was that the register calls have to be made inside the Ray session, i.e. between ray.init() and ray.shutdown().