DaRL-LibSignal / LibSignal


failed to run agent "ppo, ppo_pfrl, maddpg, mpgd" #15

Open luckywlj opened 8 months ago

luckywlj commented 8 months ago

Describe the bug

I changed the agent to "ppo", the world to "sumo", and the network to "sumo4x4"; the rest of the code in "run.py" was unchanged. When I run "run.py", I get the following error:

File "/home/xxx/traffic_signal_control/DaRL/LibSignal/trainer/tsc_trainer.py", line 90, in create_agents
    agent = Registry.mapping['model_mapping'][Registry.mapping['command_mapping']['setting'].param['agent']](self.world, 0)
KeyError: 'ppo'

Similar errors occurred with the agents "ppo_pfrl", "maddpg", and "magd". I also tested the other agents, and they run normally. Could you look into this problem? Thanks a lot!
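For anyone hitting the same KeyError: the lookup in tsc_trainer.py pulls the agent class out of Registry.mapping['model_mapping'], which is only populated when the agent module is imported and its registration decorator runs. Below is a minimal sketch of that pattern; the decorator name `register_model` and the mapping layout are assumptions inferred from the traceback, not copied from LibSignal's source.

```python
# Minimal sketch of the registry pattern behind the KeyError
# (illustrative names, not LibSignal's exact implementation).

class Registry:
    # mapping['model_mapping'] holds agent-name -> agent-class entries
    mapping = {'model_mapping': {}}

    @classmethod
    def register_model(cls, name):
        # decorator that records the agent class under the given name
        def wrapper(agent_cls):
            cls.mapping['model_mapping'][name] = agent_cls
            return agent_cls
        return wrapper


# In an agent module (e.g. ppo.py) the decorator only runs when the module
# is imported, which normally happens via agent/__init__.py:
@Registry.register_model('ppo')
class PPOAgent:
    def __init__(self, world, rank):
        self.world, self.rank = world, rank


# If ppo.py is never imported, 'ppo' never lands in the mapping and the
# lookup in create_agents raises KeyError: 'ppo'.
agent_cls = Registry.mapping['model_mapping']['ppo']
print(agent_cls)
```

So the KeyError itself usually just means the module containing the agent was never imported before the lookup ran.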

luckywlj commented 8 months ago

I registered PPOAgent in `__init__.py`, but ppo still failed to run. I checked ppo.py and found that train() is empty, so I rewrote train(). In addition, I split the model into an ActorNet and a CriticNet. Of course, there are still errors. Could you provide a complete ppo version?
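For reference, a self-contained sketch of what a clipped-surrogate PPO update with separate actor and critic networks can look like in plain PyTorch is shown below. This is not LibSignal's PPOAgent API; the class and argument names are illustrative only.

```python
import torch
import torch.nn as nn

class ActorNet(nn.Module):
    def __init__(self, obs_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, obs):
        # categorical policy over the discrete signal phases
        return torch.distributions.Categorical(logits=self.net(obs))

class CriticNet(nn.Module):
    def __init__(self, obs_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, obs):
        return self.net(obs).squeeze(-1)

def ppo_update(actor, critic, optimizer, obs, actions, old_log_probs,
               returns, advantages, clip_eps=0.2, vf_coef=0.5):
    """One PPO update over a batch of collected transitions."""
    dist = actor(obs)
    log_probs = dist.log_prob(actions)
    ratio = torch.exp(log_probs - old_log_probs)            # pi_new / pi_old
    surr1 = ratio * advantages
    surr2 = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    policy_loss = -torch.min(surr1, surr2).mean()            # clipped surrogate
    value_loss = (critic(obs) - returns).pow(2).mean()       # critic regression
    loss = policy_loss + vf_coef * value_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# toy usage with random data (e.g. 8 phases, 12-dim observation)
obs_dim, n_actions, batch = 12, 8, 32
actor, critic = ActorNet(obs_dim, n_actions), CriticNet(obs_dim)
opt = torch.optim.Adam(list(actor.parameters()) + list(critic.parameters()), lr=3e-4)
obs = torch.randn(batch, obs_dim)
actions = torch.randint(0, n_actions, (batch,))
old_log_probs = actor(obs).log_prob(actions).detach()
returns, advantages = torch.randn(batch), torch.randn(batch)
ppo_update(actor, critic, opt, obs, actions, old_log_probs, returns, advantages)
```

A real agent would also need rollout collection and advantage estimation (e.g. GAE) before this update, which the sketch leaves out.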

yifan-dadada commented 6 months ago

Hello! I encountered the same issue while using LibSignal. May I ask if you have found a solution?


I checked, and the issue is that the agent is not imported in `__init__.py`. However, even after making that modification, the problem still persists.
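If adding the import alone does not help, one thing worth double-checking is that the name in the config's agent setting matches the registered key exactly. A hypothetical `agent/__init__.py` addition, assuming the file follows the same pattern as the agents that already work (module and class names below are assumptions about your checkout, not verified):

```python
# agent/__init__.py -- importing the module is what runs its registration
# decorator and puts 'ppo' into Registry.mapping['model_mapping'].
from .ppo import PPOAgent   # hypothetical; mirror the existing lines for dqn etc.
# The same applies to the other failing agents (ppo_pfrl, maddpg, magd).
```

Even with the import in place, the run can still fail further on if ppo.py itself is incomplete (e.g. an empty train()), as described earlier in this thread.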


xingxindrst commented 2 months ago

> Hello! I encountered the same issue while using LibSignal. May I ask if you have found a solution?

Hello! I have the same issue. Have you solved it yet?