kevslinger / DTQN

Deep Transformer Q-Networks for Partially Observable Reinforcement Learning

Getting an error stating environment D does not exist #5

Closed. AvisP closed this issue 11 months ago.

AvisP commented 11 months ago

I tried to run the basic version of the code (`python run.py`) without installing any of the additional packages, and I get this error:

WARNING: ``gym_gridverse`` is not installed. This means you cannot run an experiment with the `gv_*` domains.
WARNING: ``gym_gridverse`` is not installed. This means you cannot run an experiment with the gv_*.yaml domains.
WARNING: ``gym_pomdps`` is not installed. This means you cannot run an experiment with the HeavenHell or Hallway domain. 
WARNING: ``mini_hack`` is not installed. This means you cannot run an experiment with any of the MH- domains.
Loading using gym.make
Environment with id D not found.
Loading using YAML
Traceback (most recent call last):
  File "/...../DTQN-main/utils/env_processing.py", line 34, in make_env
    env = gym.make(id_or_path)
  File "/......./lib/python3.10/site-packages/gym/envs/registration.py", line 569, in make
    _check_version_exists(ns, name, version)
  File "/......./lib/python3.10/site-packages/gym/envs/registration.py", line 219, in _check_version_exists
    _check_name_exists(ns, name)
  File "/....../lib/python3.10/site-packages/gym/envs/registration.py", line 197, in _check_name_exists
    raise error.NameNotFound(
gym.error.NameNotFound: Environment D doesn't exist. 

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/....../DTQN-main/run.py", line 533, in <module>
    run_experiment(get_args())
  File "/......./DTQN-main/run.py", line 415, in run_experiment
    envs.append(env_processing.make_env(env_str))
  File "/....../DTQN-main/utils/env_processing.py", line 39, in make_env
    inner_env = factory_env_from_yaml(
NameError: name 'factory_env_from_yaml' is not defined

But then I installed `gym_gridverse` and ran `python3 run.py --envs DiscreteCarFlags-v0 --device mps`, and it ran successfully.
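For context, the secondary traceback (`NameError: name 'factory_env_from_yaml' is not defined`) is consistent with the YAML fallback in `utils/env_processing.py` referencing a name that is only bound when the optional `gym_gridverse` import succeeds. Below is a minimal sketch of that guarded-import pattern, not the repo's actual code; the `gym_gridverse` import path is assumed.

```python
# Minimal sketch of a guarded optional import plus a gym.make/YAML fallback.
# NOTE: the gym_gridverse import path is an assumption, not copied from the repo.
import gym

try:
    from gym_gridverse.envs.yaml.factory import factory_env_from_yaml  # assumed path
except ImportError:
    # If this branch only prints a warning and never binds the name,
    # any later reference to factory_env_from_yaml raises NameError.
    print("WARNING: ``gym_gridverse`` is not installed.")
    factory_env_from_yaml = None  # binding it here lets the fallback fail cleanly


def make_env(id_or_path: str):
    """Try gym.make first; fall back to loading a GridVerse YAML file."""
    try:
        print("Loading using gym.make")
        return gym.make(id_or_path)
    except gym.error.Error as exc:
        print("Loading using YAML")
        if factory_env_from_yaml is None:
            raise RuntimeError(
                "gym_gridverse must be installed to load YAML environments"
            ) from exc
        return factory_env_from_yaml(id_or_path)
```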

kevslinger commented 11 months ago

Hi, thanks for your interest in DTQN! I recommend using the paper branch of this repo if you want more stable results that reflect the architecture described in my original arXiv paper. The main branch is more experimental -- most recently I was trying to test the agent on a variable number of environments, which I think caused the issue you ran into (this line). Thanks for the heads up.
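For readers hitting the same error: the stray environment name `D` is consistent with the run script iterating over a single environment string character by character instead of over a list of environment names (the first character of `DiscreteCarFlags-v0` is `D`). The sketch below is illustrative only and is not the repo's actual argument parser; it just shows the failure mode and one way to avoid it.

```python
import argparse

# If --envs is declared as a plain string ...
parser = argparse.ArgumentParser()
parser.add_argument("--envs", type=str, default="DiscreteCarFlags-v0")
args = parser.parse_args([])

# ... then iterating over it yields single characters: "D", "i", "s", ...
for env_str in args.envs:
    print(env_str)  # gym.make("D") -> NameNotFound

# Declaring it with nargs="+" keeps it a list of environment names instead.
parser = argparse.ArgumentParser()
parser.add_argument("--envs", nargs="+", default=["DiscreteCarFlags-v0"])
args = parser.parse_args([])
for env_str in args.envs:
    print(env_str)  # "DiscreteCarFlags-v0"
```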