pytorch / rl

A modular, primitive-first, python-first PyTorch library for Reinforcement Learning.
https://pytorch.org/rl

[BUG] `InitTracker` key flexibility #2326

Closed: matteobettini closed this issue 2 months ago

matteobettini commented 2 months ago

Two questions about the following code:

https://github.com/pytorch/rl/blob/0063741839a3e5e1a527947945494d54f91bc629/torchrl/envs/transforms/transforms.py#L6437-L6440

vmoens commented 2 months ago

For 1): IIRC, yes, there was a reason: we assume that the done is at the root and that the root batch size is indicative of the env structure. What would it mean to have a nested "init"? Happy to see a refactoring if it is well defined.
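
For context, here is a minimal sketch of what a nested "init" could look like in a multi-agent setting. The `agents` grouping, the shapes, and the `is_init` leaf name are illustrative assumptions, not torchrl's actual behavior:

```python
import torch
from tensordict import TensorDict

# Illustrative multi-agent step: the reset signal lives under a nested
# "agents" entry, so the nested batch size, not the root one, carries
# the per-agent structure.
n_envs, n_agents = 4, 2
td = TensorDict(
    {
        "agents": TensorDict(
            {"_reset": torch.zeros(n_envs, n_agents, 1, dtype=torch.bool)},
            batch_size=[n_envs, n_agents],
        )
    },
    batch_size=[n_envs],
)

# A nested "init" would simply mirror the reset key's prefix:
# reset key ("agents", "_reset") -> init key ("agents", "is_init")
td.set(("agents", "is_init"), td.get(("agents", "_reset")).clone())
```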

For 2): I don't know; if the value isn't used anywhere, we can remove it.

matteobettini commented 2 months ago

The transform itself seems to be fine with multiple reset and init keys (all it needs to do is match an init key to each reset key).

I think it is just the init method that needs some refreshing.
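
A rough sketch of the matching rule described above. This is not torchrl's actual code; the `is_init` leaf name and the `init_key_for` helper are hypothetical:

```python
from typing import Tuple, Union

NestedKey = Union[str, Tuple[str, ...]]

def init_key_for(reset_key: NestedKey, leaf: str = "is_init") -> NestedKey:
    """Map a reset key to its init key, preserving any nesting prefix."""
    if isinstance(reset_key, str):
        return leaf
    return (*reset_key[:-1], leaf)

assert init_key_for("_reset") == "is_init"
assert init_key_for(("agents", "_reset")) == ("agents", "is_init")
```

With a rule like this, the init method could iterate over the env's reset keys instead of assuming a single root-level done.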