zhijian-liu / torchpack

A neural network training interface based on PyTorch, with a focus on flexibility
https://pypi.org/project/torchpack/
MIT License

AMP Support #16

Closed CCInc closed 3 years ago

CCInc commented 3 years ago

Hi, I was wondering whether we could add AMP support now that torchsparse supports mixed precision. I think it would just require adding a GradScaler and an amp.autocast block.
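For reference, this is roughly what standard PyTorch AMP usage looks like (a minimal sketch of the stock `torch.cuda.amp` API, not torchpack-specific code; `train_step`, `criterion`, etc. are placeholder names):

```python
import torch
from torch.cuda import amp

scaler = amp.GradScaler()

def train_step(model, optimizer, criterion, inputs, targets):
    optimizer.zero_grad()
    # Run the forward pass in mixed precision.
    with amp.autocast():
        outputs = model(inputs)
        loss = criterion(outputs, targets)
    # Scale the loss so small fp16 gradients are not flushed to zero,
    # then step the optimizer through the scaler (which unscales first).
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss
```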

zhijian-liu commented 3 years ago

This is not meant to be natively supported in TorchPack, as the philosophy of TorchPack is not to interfere with the actual forward/backward pass. Using AMP should only add a few lines to your Trainer; you can refer to https://github.com/zhijian-liu/torchpack/blob/master/examples/image-classification/core/trainers.py#L34-L44 for an example.
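A rough sketch of what such a user-side Trainer could look like, modeled loosely on the linked image-classification example. The import path, the `_run_step(self, feed_dict)` hook, and the `'image'`/`'class'` keys are assumptions taken from that example and may differ from your torchpack version:

```python
from torch import nn
from torch.cuda import amp
from torchpack.train import Trainer  # assumed import path


class AMPClassificationTrainer(Trainer):
    def __init__(self, model, criterion, optimizer, scheduler, amp_enabled=True):
        self.model = model
        self.criterion = criterion
        self.optimizer = optimizer
        self.scheduler = scheduler
        # GradScaler is a no-op when enabled=False, so fp32 training still works.
        self.scaler = amp.GradScaler(enabled=amp_enabled)

    def _run_step(self, feed_dict):
        # Keys assumed from the image-classification example.
        inputs = feed_dict['image'].cuda(non_blocking=True)
        targets = feed_dict['class'].cuda(non_blocking=True)

        # Forward pass under autocast so eligible ops run in fp16.
        with amp.autocast(enabled=self.scaler.is_enabled()):
            outputs = self.model(inputs)
            loss = self.criterion(outputs, targets)

        if self.model.training:
            self.optimizer.zero_grad()
            # Scaled backward pass plus optimizer step through the scaler.
            self.scaler.scale(loss).backward()
            self.scaler.step(self.optimizer)
            self.scaler.update()
            self.scheduler.step()

        return {'outputs': outputs, 'targets': targets}
```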

CCInc commented 3 years ago

I'm sorry, I did not read `_run_step` closely enough. I'll modify it in the spvnas source. Thanks!