PufferAI / PufferLib

Simplifying reinforcement learning for complex game environments
https://pufferai.github.io/

Changing dtype to a subarray type error on env.reset for Crafter env #81

Closed: robflynnyh closed this issue 2 months ago

robflynnyh commented 2 months ago

Hi, I'm trying to get PufferLib working with the Crafter environment (Gym) and am running into an error, included below. I've also created a Colab to reproduce the error here: https://colab.research.google.com/drive/1IwRoPr2noB0C2H3PR-g5ShY1qzQcJE8Z?usp=sharing

Hopefully I am not doing something wrong :P

""" Traceback (most recent call last): File "main.py", line 167, in main(args=args) File "main.py", line 70, in main envs.reset() File "/home/robertflynn/anaconda3/envs/ML/lib/python3.8/site-packages/pufferlib/vector.py", line 45, in reset vecenv.async_reset(seed) File "/home/robertflynn/anaconda3/envs/ML/lib/python3.8/site-packages/pufferlib/vector.py", line 130, in async_reset ob, i = env.reset(seed=s) File "/home/robertflynn/anaconda3/envs/ML/lib/python3.8/site-packages/pufferlib/emulation.py", line 175, in reset self.obs_struct = self.obs.view(self.obs_dtype) ValueError: Changing the dtype to a subarray type is only supported if the total itemsize is unchanged """

jsuarez5341 commented 2 months ago

On it!

jsuarez5341 commented 2 months ago

@robflynnyh The issue is that Crafter is a dated env that uses an old Gym API. PufferLib provides pufferlib.environments with working wrapped versions of these envs, which fixes the issue in your Colab. Check it out here: https://github.com/PufferAI/PufferLib/blob/1.0/pufferlib/environments/crafter/environment.py
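A rough usage sketch of the wrapped env. The `env_creator()` name and the `pufferlib.vector.make` call are assumptions based on the convention the other pufferlib.environments modules follow; check the linked environment.py for the actual API:

```python
# Sketch only: env_creator() and pufferlib.vector.make are assumed from the
# conventions of other pufferlib.environments modules; verify against the
# linked environment.py before relying on them.
import pufferlib.vector
from pufferlib.environments import crafter

make_env = crafter.env_creator()          # callable that builds the wrapped (new-API) Crafter env
vecenv = pufferlib.vector.make(make_env)  # vectorize with PufferLib's vecenv machinery
vecenv.reset()                            # no more subarray-dtype ValueError on reset
```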

If this helped and you haven't already starred the repo, please feed the puffer a star

robflynnyh commented 2 months ago

All working now, thanks. I also had to update PyTorch and Python.