Open lix19937 opened 4 months ago
As far as I can tell, the intended use is to build and install this locally using setup.py in bindings/torch. Then you can use it as part of PyTorch networks in Python. You need the CUDA toolkit available for building the .cu files, and the libtorch library (the C++ version of torch) for the matching version of CUDA.
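Concretely, the local build looks something like this (a sketch of the steps described above; it assumes nvcc and a CUDA-enabled PyTorch are already installed, and the smoke-test line is just illustrative):

```shell
# Build and install the PyTorch bindings locally.
# Requires the CUDA toolkit (nvcc) and a PyTorch build matching your CUDA version.
cd bindings/torch
python setup.py install

# Quick smoke test after installation:
python -c "import tinycudann; print(tinycudann.__name__)"
```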
The C++ side of things is sparser. I basically had to duplicate the functionality of modules.py and bindings.cpp using libtorch. It would be much more convenient if the Module class in bindings/torch/bindings.cpp were a) exposed in a header file, and b) itself a torch::nn::Module, so that the Python middle layer wouldn't be needed.
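To illustrate the request, a header-exposed wrapper could look roughly like this. This is only a sketch: the class name, the tcnn::cpp calls, and their signatures are assumptions for illustration, not the project's actual API, and the backward pass is deliberately left out.

```cpp
// Sketch only: what a header-exposed torch::nn::Module wrapper might look like,
// so C++ users could compose tcnn networks without the Python middle layer.
// All tcnn-side names below are illustrative assumptions.
#pragma once

#include <torch/torch.h>
#include <tiny-cuda-nn/cpp_api.h>  // assumed location of the tcnn C++ API

#include <memory>

namespace tcnn_torch {

class TcnnNetworkImpl : public torch::nn::Module {
 public:
  TcnnNetworkImpl(uint32_t n_input_dims, uint32_t n_output_dims,
                  const nlohmann::json& config)
      : n_output_dims_(n_output_dims),
        native_(tcnn::cpp::create_network(n_input_dims, n_output_dims, config)) {
    // Register the packed parameter tensor so torch optimizers and
    // serialization treat it like any other parameter.
    params_ = register_parameter(
        "params",
        torch::zeros({static_cast<int64_t>(native_->n_params())},
                     torch::dtype(torch::kFloat32).device(torch::kCUDA)));
  }

  torch::Tensor forward(torch::Tensor input) {
    // Illustrative only: a real implementation would route through a custom
    // autograd function (as bindings.cpp does) so gradients flow to params_.
    auto output = torch::empty(
        {input.size(0), static_cast<int64_t>(n_output_dims_)}, input.options());
    // native_->inference(...);  // actual tcnn call and signature omitted
    return output;
  }

 private:
  uint32_t n_output_dims_;
  std::unique_ptr<tcnn::cpp::Module> native_;
  torch::Tensor params_;
};
TORCH_MODULE(TcnnNetwork);  // generates the shared-pointer TcnnNetwork handle

}  // namespace tcnn_torch
```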
We released this publicly earlier this week if you want to have a look: https://github.com/fbriggs/lifecast_public/commit/aaa1000ccfcd9bf94a143fc21300fe6607636342#diff-aef1b984940ceb407dca09e98a080c07a1cecbb1ae6b386caa0028e03e45bc48
Thanks both.
What is the purpose of this software, and what are some example applications? Does it sit between PyTorch and TensorRT? Is it for training or inference? Or for building wheels?