Closed m-lyon closed 2 years ago
I'm afraid not, because PyTorch does not attach names to tensors.
We could probably maintain a tensor -> name
mapping in this tool, but that would require a considerable amount of code changes.
I would recommend reporting the memory usage at multiple critical locations so you can get a deeper understanding of the memory behaviour underneath:
import torch
from pytorch_memlab import MemReporter

reporter = MemReporter()

a = torch.Tensor(5, 5, 5)
reporter.report()
b = a + 5
reporter.report()
c = b * 5
reporter.report()
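For reference, a minimal sketch of the kind of user-side tensor -> name mapping mentioned above. The `tensor_names` dict and `label` helper are hypothetical illustrations, not part of pytorch_memlab; they only let you look up your own labels next to the reporter output.

```python
import torch

# Hypothetical workaround: keep our own labels keyed by id(),
# since PyTorch tensors carry no built-in name.
tensor_names = {}

def label(tensor, name):
    """Record a human-readable label for `tensor` and return it unchanged."""
    tensor_names[id(tensor)] = name
    return tensor

a = label(torch.Tensor(5, 5, 5), 'a')
b = label(a + 5, 'b = a + 5')

print(tensor_names[id(b)])  # prints "b = a + 5"
```

One caveat with this approach: `id()` values can be reused once a tensor is garbage-collected, so the mapping is only reliable while the labelled tensors are still alive.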
I'm developing a custom network layer, and as a result the MemReporter output contains many unnamed Tensors. Below is a snippet example.
Is there any way to name or label these tensors (Tensor 0 through Tensor 10, for example) so that I can more easily determine which operation specifically creates each one?