EKostinoudis / Carla_Autonomous_Driving

Autonomous driving agent in Carla simulator leveraging IL and RL techniques.
MIT License

Evaluation step? #1

Open donghaiwang opened 1 month ago

donghaiwang commented 1 month ago

Thank you for sharing. After I download the trained model, how do I evaluate its performance and submit it to the leaderboard?

EKostinoudis commented 1 month ago

Hello, unfortunately I don't have the code for the evaluation on the Leaderboard. If you want to run the model you should:

  1. Install the Leaderboard and the corresponding ScenarioRunner. (If I remember correctly, to use Leaderboard 1.0 with the newer versions of Carla you must get a missing file from the old versions; also, if I remember correctly again, the camera sensor settings are different in Leaderboard 2.0.)
  2. Create the AutonomousAgent class (as described here; similar to `CILv2_RLModule.py`).
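The agent interface in step 2 can be sketched roughly as below. This is a minimal sketch only: a real agent subclasses `leaderboard.autoagents.autonomous_agent.AutonomousAgent` and returns a `carla.VehicleControl`; tiny stand-in classes are used here so the example runs without CARLA installed, and the class name `MyAgent`, the sensor values, and the model-loading step are placeholders.

```python
class AutonomousAgent:
    """Stand-in for the real Leaderboard base class (placeholder)."""
    def __init__(self, path_to_conf_file=None):
        self.setup(path_to_conf_file)

    def setup(self, path_to_conf_file):
        pass


class VehicleControl:
    """Stand-in for carla.VehicleControl (placeholder)."""
    def __init__(self, throttle=0.0, steer=0.0, brake=0.0):
        self.throttle, self.steer, self.brake = throttle, steer, brake


def get_entry_point():
    # The Leaderboard looks up this function to find the agent class.
    return "MyAgent"


class MyAgent(AutonomousAgent):
    def setup(self, path_to_conf_file):
        # Load the trained model checkpoint here (hypothetical step).
        self.model = None

    def sensors(self):
        # Sensor suite: one center RGB camera. All values are illustrative;
        # note the camera settings differ between Leaderboard 1.0 and 2.0.
        return [{
            'type': 'sensor.camera.rgb', 'id': 'Center',
            'x': 0.7, 'y': 0.0, 'z': 1.6,
            'roll': 0.0, 'pitch': 0.0, 'yaw': 0.0,
            'width': 800, 'height': 600, 'fov': 100,
        }]

    def run_step(self, input_data, timestamp):
        # input_data maps sensor id -> (frame, data); feed the image to
        # the model and return a control command (dummy values here).
        return VehicleControl(throttle=0.5, steer=0.0, brake=0.0)
```

The Leaderboard runner instantiates the class named by `get_entry_point()`, calls `sensors()` once to spawn the sensor suite, and then calls `run_step()` every tick.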

If you have any more questions, feel free to reply.

donghaiwang commented 1 month ago

Thank you for your detailed reply. I have two questions about the running steps.

  1. After extracting `_results.tar.gz`, how do I use this trained model to test its behavior in a Carla scene?

  2. When I train the PPO agent with the command `RAY_DEDUP_LOGS=0 PYTHONPATH=. python3 train/train_ppo.py`, how can I view the results during and after training?

EKostinoudis commented 3 weeks ago

> 1. After extracting `_results.tar.gz`, how do I use this trained model to test its behavior in a Carla scene?
>
> 2. When I train the PPO agent with the command `RAY_DEDUP_LOGS=0 PYTHONPATH=. python3 train/train_ppo.py`, how can I view the results during and after training?
1. You can use the `CILv2_agent.py` of this repo, or take a look at the CILv2_multiview repo.
2. You can observe the Carla server window, or you can display the camera (the center one, or all 3) of the actor.
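Displaying the actor's camera, as suggested in point 2, mostly comes down to converting the raw frame into an image array. A minimal sketch of that conversion is below, using synthetic bytes in place of a real frame; in practice the bytes would be `carla.Image.raw_data` delivered to the sensor's `listen` callback (CARLA cameras produce BGRA pixels), and the resulting array would be shown with e.g. OpenCV or pygame.

```python
import numpy as np

def bgra_to_rgb(raw_data: bytes, height: int, width: int) -> np.ndarray:
    """Convert a raw BGRA camera frame to an RGB array for display."""
    arr = np.frombuffer(raw_data, dtype=np.uint8).reshape(height, width, 4)
    return arr[:, :, 2::-1]  # drop alpha, reorder BGR -> RGB

# Synthetic 2x2 all-blue frame standing in for carla.Image.raw_data:
h, w = 2, 2
fake = np.zeros((h, w, 4), dtype=np.uint8)
fake[..., 0] = 255  # blue channel in BGRA layout
rgb = bgra_to_rgb(fake.tobytes(), h, w)
print(rgb.shape, rgb[0, 0])  # (2, 2, 3) [  0   0 255]
```

To show all 3 cameras, the same conversion would be applied to each sensor's frames and the arrays tiled side by side before display.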