eloialonso / iris

Transformers are Sample-Efficient World Models. ICLR 2023, notable top 5%.
https://openreview.net/forum?id=vhFu1Acb0xb
GNU General Public License v3.0

About world model training #8

Closed. rook86 closed this issue 1 year ago.

rook86 commented 1 year ago

Hi, thank you for your wonderful work! I have a question about world model training. Looking at world_model.py, it seems you are masking the tokens output by the tokenizer. Is the world model learning a masking problem? This seems different from the usual world model training presented in Dreamer and similar works. Sincerely.

eloialonso commented 1 year ago

Hi, thanks for your interest! The world model is trained to autoregressively predict the tokens of the next frame, whereas DreamerV2 predicts the full discrete latent state in one shot.
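To make the distinction concrete, here is a minimal sketch of that autoregressive objective in PyTorch. This is not the code from world_model.py; the module name, hyperparameters, and the `next_token_loss` helper are hypothetical, and it assumes the masking in question is the causal attention mask of a standard GPT-style transformer. It only illustrates the general setup the answer describes: next-token cross-entropy over tokenizer ids, rather than a one-shot prediction of the full latent state.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of autoregressive next-frame token prediction.
# Assumes each frame has already been encoded by a discrete tokenizer
# into a fixed number of integer token ids, concatenated in time order.

class WorldModelSketch(nn.Module):
    def __init__(self, vocab_size=512, d_model=256, n_layers=4, n_heads=4, max_len=1024):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (B, T) tokenizer ids for a sequence of frames
        B, T = tokens.shape
        x = self.embed(tokens) + self.pos(torch.arange(T, device=tokens.device))
        # Causal mask: position t attends only to positions <= t, so the
        # tokens of the next frame are predicted from past tokens only.
        causal = nn.Transformer.generate_square_subsequent_mask(T).to(tokens.device)
        x = self.blocks(x, mask=causal)
        return self.head(x)  # (B, T, vocab_size) logits

def next_token_loss(model, tokens):
    # Standard autoregressive objective: predict token t+1 from tokens <= t.
    logits = model(tokens[:, :-1])
    return nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tokens[:, 1:].reshape(-1)
    )
```

Under this setup, imagination rolls out one token at a time (sampling from the logits and feeding the sample back in) until a full frame's worth of tokens has been generated, whereas a DreamerV2-style model would emit the entire discrete latent in a single forward pass.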

rook86 commented 1 year ago

I understand, thank you!