Closed: wertyuilife2 closed this issue 1 month ago
It does not solve much, but `rb._storage._storage` is the same as `rb[:]`. It is a bit more intuitive and clean (maybe you could call `update()` on `rb[:][info["index"]]`).
@albertbou92 yes, I use `rb._storage._storage['key'][index_list]` in practice.
@albertbou92, oh, I got you wrong. I mean it's clean to use `rb[:]`, I agree. But I don't think it's something anyone will naturally figure out (and it should be done while holding `buffer._replay_lock`). Also, `TensorStorage.set()` looks like a usable method, but it doesn't actually behave as expected, which may confuse others who use it.
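For reference, the sample-modify-write-back pattern the comments converge on would look something like this (a minimal sketch; `rb` is assumed to be a TensorDict-backed replay buffer and `"obs"` an illustrative key):

```python
# Sample a batch together with the storage indices it came from.
data, info = rb.sample(return_info=True)
data["obs"] += 1.0

# Write straight into the underlying tensordict while holding the lock,
# bypassing TensorStorage.set() and its side effect on _len.
with rb._replay_lock:
    rb._storage._storage["obs"][info["index"]] = data["obs"]
```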
right, makes sense!
Motivation
This issue comes from the original issue #2205.
My work requires modifying the contents of the buffer. Specifically, I need to sample an item, modify it, and put it back into the buffer. However, torchrl currently does not seem to encourage modifying buffer contents. When calling `buffer._storage.set(index, data)` to put my modified data back into the buffer, it implicitly changes `_storage._len`, which can cause the sampler to sample empty samples. The following code demonstrates this issue:
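A minimal sketch of the failure mode (the buffer classes are torchrl's public API; the key `"obs"`, the sizes, and the write index are illustrative):

```python
import torch
from tensordict import TensorDict
from torchrl.data import TensorDictReplayBuffer, LazyTensorStorage

# Capacity 100, but only 10 items have actually been written.
buffer = TensorDictReplayBuffer(storage=LazyTensorStorage(100), batch_size=4)
buffer.extend(TensorDict({"obs": torch.ones(10, 3)}, batch_size=[10]))
print(len(buffer))  # 10

# Write a modified item back through the storage at a slot of our choosing.
item = TensorDict({"obs": 2 * torch.ones(1, 3)}, batch_size=[1])
buffer._storage.set(torch.tensor([50]), item)

# set() has implicitly updated _storage._len to cover index 50, so slots
# 10..49, which were never written, now look valid to the sampler.
print(buffer._storage._len)    # grew past the 10 items actually stored
print(buffer.sample()["obs"])  # may now contain never-written ("empty") rows
```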
I resolved this by directly modifying `buffer._storage._storage` while holding the `buffer._replay_lock`. It took me two days to discover that `TensorStorage.set()` implicitly changes `_len`. I believe this method should behave more intuitively. I am not sure whether other `Storage` classes have similar issues, but `TensorStorage` definitely does.

Solution
Provide a method that can modify the ReplayBuffer in place, e.g. `ReplayBuffer.set(index, data)`.
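A sketch of the intended semantics (hypothetical; no such method exists today, and the names are suggestions only):

```python
# Hypothetical ReplayBuffer.set(index, data): write modified data back at
# the sampled indices, in place, without touching the write cursor or _len.
n_before = len(buffer)
data, info = buffer.sample(return_info=True)
data["obs"] += 1.0
buffer.set(info["index"], data)
assert len(buffer) == n_before  # the number of valid items is unchanged
```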
Additional context
See discussion in the original issue #2205.
Checklist