eloialonso / iris

Transformers are Sample-Efficient World Models. ICLR 2023, notable top 5%.
https://openreview.net/forum?id=vhFu1Acb0xb
GNU General Public License v3.0

Size mismatch when evaluating world model #14

Closed mamengyiyi closed 1 year ago

mamengyiyi commented 1 year ago

Hi, I ran your code using python src/main.py env.train.id=BreakoutNoFrameskip-v4 common.device=cuda:0. However, I encountered a size-mismatch error the first time the world model was evaluated (see attached screenshot). Could you please help me fix this problem?

eloialonso commented 1 year ago

Hi, we did not face this issue. Can you share your modifications to the code if any?

mamengyiyi commented 1 year ago

> Hi, we did not face this issue. Can you share your modifications to the code if any?

Thanks for your reply! My only modifications were reducing the default batch_num_samples of the tokenizer, world model, and actor_critic to 64, 32, and 32, respectively. Strangely, when I reran the code, the issue disappeared. I will report back here if I encounter it again.
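For reference, changes like the ones described above can also be passed as command-line overrides instead of editing the config files, since the repo's entry point accepts Hydra-style key=value overrides (as in the original command). The exact config key paths below are assumptions based on the components named in this thread; check the repo's config directory for the actual structure.

```shell
# Sketch: overriding batch_num_samples per component at launch time.
# Key paths (training.tokenizer.*, etc.) are assumed, not verified.
python src/main.py \
  env.train.id=BreakoutNoFrameskip-v4 \
  common.device=cuda:0 \
  training.tokenizer.batch_num_samples=64 \
  training.world_model.batch_num_samples=32 \
  training.actor_critic.batch_num_samples=32
```

Keeping such changes on the command line makes it easier to bisect which override (if any) triggers the size mismatch, since each run's configuration is fully visible in the invocation.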