Hi, thanks for the great work. Could you please let me know if the current PPO implementation has action masking?

Kind of. Please use https://github.com/vwxyzjn/ppo-implementation-details/blob/main/ppo_multidiscrete_mask.py
I did not include it in CleanRL because I did not want to include the gym-microrts dependency.

Thanks a lot!

Closed · aqibsaeed closed this 1 year ago
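For anyone landing here: the script linked above handles invalid actions by masking the policy's logits before sampling. Below is a minimal PyTorch sketch of that idea (a simplified illustration, not the exact `CategoricalMasked` class from the linked file): logits of invalid actions are replaced with a large negative number, so those actions get near-zero probability and are never sampled.

```python
# Hedged sketch of invalid-action masking for a categorical policy.
# Assumption: `mask` is a boolean tensor where True marks a valid action.
import torch
from torch.distributions import Categorical


class CategoricalMasked(Categorical):
    def __init__(self, logits: torch.Tensor, mask: torch.Tensor):
        self.mask = mask
        # Push invalid actions' logits to a very negative value so their
        # probability after softmax is effectively zero.
        masked_logits = logits.masked_fill(~mask, -1e8)
        super().__init__(logits=masked_logits)


logits = torch.tensor([[1.0, 2.0, 3.0, 4.0]])
mask = torch.tensor([[True, False, True, False]])
dist = CategoricalMasked(logits=logits, mask=mask)
action = dist.sample()  # always a valid (unmasked) action
```

For a multidiscrete action space, the same trick is applied per component: split the flat logits into one group per sub-action, mask each group, and sample each `CategoricalMasked` independently.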