**mosheraboh** opened 1 year ago
Note that torch 1.13.0 currently emits a user warning:

> UserWarning: The PyTorch API of nested tensors is in prototype stage and will change in the near future.

If/when it works well, it could be a very nice GPU-memory (GRAM) saver.
**Is your feature request related to a problem? Please describe.**
Support batching variable-size tensors using nested tensors (https://pytorch.org/tutorials/prototype/nestedtensor.html) to avoid padding and improve running time.
**Describe the solution you'd like**
Add such an option to `CollateDefault` as an alternative to `CollateDefault.pad_all_tensors_to_same_size`.
**Describe alternatives you've considered**
N/A
**Additional context**
N/A
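
A minimal sketch of what such a collate option could look like, using the prototype `torch.nested` API (torch >= 1.13). The function name `collate_as_nested` is an assumption for illustration, not an existing FuseMedML API; the real option would live inside `CollateDefault`:

```python
import torch

def collate_as_nested(tensors):
    # Hypothetical alternative to pad_all_tensors_to_same_size:
    # each sample keeps its own shape, so no padding memory is allocated.
    # Note: emits the prototype-stage UserWarning on torch 1.13.
    return torch.nested.nested_tensor(tensors)

# Three samples with different lengths along dim 0.
samples = [torch.randn(3, 5), torch.randn(7, 5), torch.randn(2, 5)]
nt = collate_as_nested(samples)
print(nt.is_nested)  # True

# Ops that still require a dense batch can pad on demand,
# paying the padding cost only where it is actually needed.
padded = torch.nested.to_padded_tensor(nt, 0.0)
print(padded.shape)  # torch.Size([3, 7, 5])
```

Padding lazily via `to_padded_tensor` (rather than eagerly in the collate) is the main win: downstream ops that support nested tensors never see the padded copy at all.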