ucbrise / actnn

ActNN: Reducing Training Memory Footprint via 2-Bit Activation Compressed Training
MIT License

how to deploy actnn by libtorch? #11

Closed lucify123 closed 3 years ago

lucify123 commented 3 years ago

It's really interesting work! I just wonder how we can deploy actnn using libtorch. With libtorch or ONNX (from Python to C++), it would make actnn more useful. Thank you~

cjf00000 commented 3 years ago

Thanks for your interest in our work. ActNN is a library for reducing the training memory footprint by compressing the activations saved in each layer. At inference time, a layer's activations aren't saved anyway, since we don't need to compute gradients. Therefore, ActNN won't reduce the memory footprint at inference time.
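
To illustrate the idea (this is a simplified sketch, not ActNN's actual algorithm, which uses per-group quantization with stochastic rounding): during training, each layer's saved activations can be quantized to 2 bits and dequantized later for the backward pass, cutting storage from 32 bits to roughly 2 bits per value.

```python
# Illustrative sketch only (not ActNN's real implementation):
# uniform 2-bit quantization of a saved activation tensor.

def quantize_2bit(activations):
    """Map each float to one of 4 levels (2 bits), keeping min and scale."""
    lo, hi = min(activations), max(activations)
    scale = (hi - lo) / 3 or 1.0  # 4 levels -> 3 intervals; guard hi == lo
    codes = [round((a - lo) / scale) for a in activations]
    return codes, lo, scale

def dequantize_2bit(codes, lo, scale):
    """Approximately reconstruct the activations for the backward pass."""
    return [lo + c * scale for c in codes]

acts = [0.1, 0.5, 0.9, 0.3]
codes, lo, scale = quantize_2bit(acts)
recon = dequantize_2bit(codes, lo, scale)
# Each code fits in 2 bits, so saved-activation storage shrinks ~16x
# versus 32-bit floats, at the cost of bounded reconstruction error.
```

Since inference runs without gradient computation, these activations are never saved in the first place, which is why the compression only helps during training.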