BiomedSciAI / fuse-med-ml

A Python framework accelerating ML-based discovery in the medical field by encouraging code reuse. Batteries included :)
Apache License 2.0

Support batching variable size tensors using nested tensors #219

Open mosheraboh opened 1 year ago

mosheraboh commented 1 year ago

Is your feature request related to a problem? Please describe. Support batching variable-size tensors using nested tensors (https://pytorch.org/tutorials/prototype/nestedtensor.html) to avoid padding and improve running time.

Describe the solution you'd like Add such an option to CollateDefault as an alternative to CollateDefault.pad_all_tensors_to_same_size (a sketch of the idea is included below).

Describe alternatives you've considered N/A

Additional context N/A
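
A minimal sketch of the idea, assuming torch >= 1.13. The helper name `collate_to_nested` and the exact integration point into `CollateDefault` are assumptions, not existing FuseMedML API:

```python
import torch
from typing import List


def collate_to_nested(tensors: List[torch.Tensor]) -> torch.Tensor:
    """Hypothetical collate helper: pack variable-size tensors into one
    nested (ragged) batch instead of padding them to a common size."""
    # Emits a UserWarning while the nested-tensor API is in prototype stage.
    return torch.nested.nested_tensor(tensors)


# Three samples with different first-dimension sizes but the same feature dim.
samples = [torch.randn(3, 5), torch.randn(7, 5), torch.randn(2, 5)]

batch = collate_to_nested(samples)
print(batch.is_nested)  # True

# For ops not yet covered for nested tensors, the batch can still be
# materialized as a padded dense tensor on demand:
dense = torch.nested.to_padded_tensor(batch, padding=0.0)
print(dense.shape)  # torch.Size([3, 7, 5])
```

Compared to CollateDefault.pad_all_tensors_to_same_size, this avoids allocating the padded region up front and defers padding until an op actually requires a dense layout.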

SagiPolaczek commented 1 year ago

Note that currently there is a user warning (torch 1.13.0):

```
UserWarning: The PyTorch API of nested tensors is in prototype stage and will change in the near future.
```
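
For context, a minimal way to reproduce the warning and, if it gets noisy in training logs, scope it out (assuming torch >= 1.13; the filter is optional and purely cosmetic):

```python
import warnings

import torch

# Constructing a nested tensor triggers the prototype-stage UserWarning.
# The filter below scopes it out without hiding other warnings.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore",
        message="The PyTorch API of nested tensors is in prototype stage",
        category=UserWarning,
    )
    nt = torch.nested.nested_tensor([torch.randn(2), torch.randn(5)])
```
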
YoelShoshan commented 1 year ago

If/when it works well, it could be a very nice GPU-memory (GRAM) saver.

SagiPolaczek commented 1 year ago

For ref: General NestedTensor op coverage tracking issue.