daphnecor closed this 3 weeks ago
Use this instead of dev: https://github.com/PufferAI/PufferLib/tree/gpudrive
It's a fork I just made off of dev right now with no changes. I break dev a lot and will probably break it even more in the process of preparing for 1.1. You can pin 1.1 after I finish that.
Description
This PR integrates PufferLib for fast PPO. Additionally, it restructures `algorithms` into `integrations` for improved naming.
Readme: https://github.com/Emerge-Lab/gpudrive/tree/integrations/smoketested-puffer-ppo
Todo
- `cpu` and `cuda`
Notes
`pufferlib` is installed from the `gpudrive` branch. To install it, use the command sketched below.
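A minimal sketch of that install, assuming a plain `pip` install from the `gpudrive` branch of PufferAI/PufferLib linked above (the exact command used in this PR may differ):

```bash
# Assumed: install pufferlib directly from the gpudrive branch
pip install git+https://github.com/PufferAI/PufferLib.git@gpudrive
```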
Benchmarking
SB3
SB3 PPO run command to train on a single scene with 3 controlled agents
SB3 PPO run command to train on a single dense scene with [many] controlled agents
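A hypothetical shape for these commands; the script path and flag names are illustrative assumptions, not the repo's actual CLI:

```bash
# Hypothetical: single scene with 3 controlled agents (path and flags are assumptions)
python integrations/sb3/ppo.py --num-worlds 1 --max-controlled-agents 3

# Hypothetical: single dense scene with many controlled agents (replace <N> with the agent count)
python integrations/sb3/ppo.py --num-worlds 1 --max-controlled-agents <N>
```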
Puffer
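The Puffer run is the counterpart to the SB3 commands above; a hypothetical sketch, assuming a PufferLib PPO entry point under `integrations` (path and flags are assumptions):

```bash
# Hypothetical: PufferLib PPO on the same single-scene setup (path and flags are assumptions)
python integrations/puffer/ppo.py --num-worlds 1 --max-controlled-agents 3
```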