MLJejuCamp2017 / DRL_based_SelfDrivingCarControl

Deep Reinforcement Learning (DQN) based Self Driving Car Control with Vehicle Simulator

Couldn't launch the jeju_camp environment. #5

Closed: Zhousiyuhit closed this issue 6 years ago

Zhousiyuhit commented 6 years ago

Hello, I encountered a problem: "Couldn't launch the jeju_camp environment."

[Screenshot of the error, 2018-05-08 12:36:56]

Could you tell me how to solve this problem? Thanks~

Kyushik commented 6 years ago

You should set env_name to the path of the environment. In my case, the environment path is ../environment/jeju_camp, so that is the path I wrote for env_name.
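
For reference, this is roughly what the launch cell in the notebook expects (a minimal sketch; the exact path depends on where you unzipped the environment, and ../environment/jeju_camp is just the example path from this thread):

from unityagents import UnityEnvironment

env_name = "../environment/jeju_camp"   # path to the unzipped environment binary, without file extension
env = UnityEnvironment(file_name=env_name)   # launches the simulator
print(str(env))   # examine environment parameters
env.close()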

Zhousiyuhit commented 6 years ago

Sorry, I don't know: is the environment path a Python path or a Unity path?

Zhousiyuhit commented 6 years ago

Maybe I've got it; it's the Unity path~

Kyushik commented 6 years ago

Yeah~ you should unzip the environment file to a certain path and write the path of the file to env_path :)

Zhousiyuhit commented 6 years ago

Thanks very much. I have another question about the environment path~ I installed Unity on macOS, so how do I find the Unity environment to launch?

Kyushik commented 6 years ago

Oh! Actually, you don't need Unity to launch the environment file. Just unzip it into some folder, set env_path, and launch the ipynb file. However, I haven't uploaded environments for Mac and Linux yet. I will upload them today!
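
If you are unsure whether the path points at the right place, a quick sanity check (illustrative only, not part of the repo) is to test for the platform-specific binary next to that name:

import os

env_name = "../environment/jeju_camp"   # assumed location of the unzipped environment
# Assumption: the Unity build is jeju_camp.app on macOS, jeju_camp.exe on Windows, jeju_camp.x86_64 on Linux.
candidates = [env_name + ext for ext in (".app", ".exe", ".x86_64")]
print([p for p in candidates if os.path.exists(p)])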

Zhousiyuhit commented 6 years ago

Ok, thanks~ 😁

Zhousiyuhit commented 6 years ago

I just reset the environment path, and there was another NameError: "name 'UnityEnvironment' is not defined." So if you update the environment for Mac, will the problem be solved?
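
That NameError usually just means the import cell was not run; the class comes from the unityagents package bundled with the repo, so the import cell needs to execute before the launch cell. A minimal sketch (the path below is only the example from this thread):

from unityagents import UnityEnvironment   # bundled with the repo; run this before creating the environment

env_name = "../environment/jeju_camp"      # example path; adjust to your unzipped environment
env = UnityEnvironment(file_name=env_name)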

Kyushik commented 6 years ago

I updated it! :) Unzip the environment zip file into the environment folder, and then run the ipynb file.

Zhousiyuhit commented 6 years ago

Ok, thanks a lot, I will try it now~

Zhousiyuhit commented 6 years ago

Oh, I succeeded in importing an environment, but the model is the 3DBall one from when I was learning the Unity examples, so how can I get the "self driving car" model? Thanks~

Kyushik commented 6 years ago

Really? That's odd... Did you set env_path as ../environment/jeju_camp?

Zhousiyuhit commented 6 years ago

Oh, I just set the env_name like this:

env_name = "/Users/siyuchou/Desktop/DRL_Lab/DRL_based_SelfDrivingCarControl/SiyuZhou_3Dball"

But I think the environment zip file you mentioned is the self-driving-car model? Is that right?

Kyushik commented 6 years ago

Yes. The link to the environment.zip file is in the readme.md. Download it and unzip it! :)

Zhousiyuhit commented 6 years ago

Oh, the problem continues! I set:

env_name = "/Users/siyuchou/Desktop/DRL_Lab/DRL_based_SelfDrivingCarControl/environment/Contents/MacOS/jeju_camp"   # Name of the Unity environment binary to launch
train_mode = True   # Whether to run the environment in training or inference mode

Is that right?

And when I run the Jupyter notebook, the error is shown as:

UnityEnvironmentException                 Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 env = UnityEnvironment(file_name=env_name)
      2
      3 # Examine environment parameters
      4 print(str(env))

~/Desktop/DRL_Lab/DRL_based_SelfDrivingCarControl/unityagents/environment.py in __init__(self, file_name, worker_id, base_port, curriculum, seed)
     92                 raise UnityEnvironmentException("Couldn't launch the {0} environment. "
     93                                                 "Provided filename does not match any environments."
---> 94                                                 .format(true_filename))
     95             else:
     96                 logger.debug("This is the launch string {}".format(launch_string))

UnityEnvironmentException: Couldn't launch the jeju_camp environment. Provided filename does not match any environments.

Maybe I made a mistake in one of the steps??

Kyushik commented 6 years ago

Oh, okay. I updated my GitHub repo to version 0.8! Please erase the previous version and download the new one. There will be an environment folder; unzip environment_mac.zip into that folder, then run the ipynb file. I think this will solve your problem! :smile:
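
Putting the thread together, the final macOS setup would presumably look like this in the notebook (paths are illustrative, assuming environment_mac.zip was unzipped into the repo's environment folder):

from unityagents import UnityEnvironment

env_name = "../environment/jeju_camp"   # Name of the Unity environment binary to launch
train_mode = True                       # Whether to run the environment in training or inference mode

env = UnityEnvironment(file_name=env_name)
print(str(env))                         # Examine environment parameters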