Closed · TheaperDeng closed this issue 2 years ago
LGTM. One comment:
`@basic_lightningmodule(loss_creator=loss, optim_creator=optim, config)`
Should we make these all keyword arguments? Positional arguments have to come before keyword arguments, so this call is invalid as written (see the sketch below).
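For illustration, a minimal sketch of the point above (assumptions: `basic_lightningmodule` is importable as discussed in this thread, and `loss`, `optim`, `config` are placeholders, not names from the actual PR). Python rejects a positional argument after keyword arguments, so the call has to be made all-keyword, e.g. by unpacking `config`:

```python
import torch
from torch import nn

loss = nn.MSELoss          # placeholder loss creator
optim = torch.optim.Adam   # placeholder optimizer creator
config = {"lr": 1e-3}      # placeholder extra hyperparameters

# Invalid: a positional argument cannot follow keyword arguments (SyntaxError).
# @basic_lightningmodule(loss_creator=loss, optim_creator=optim, config)

# Valid all-keyword form, unpacking config into the decorator's **optim_configs:
@basic_lightningmodule(loss_creator=loss, optim_creator=optim, **config)
class Net(nn.Module):
    ...
```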
Sure, it's actually a typo.
And there seem to be many typos :(
This decorator is implemented in #3181. @yangw1234 will add a simple unit test soon.
@yangw1234
Related to #3171.
Since we have decided to provide a pytorch-lightning "wrapper" inside nano rather than in Chronos, we (@zhentaocc) proposed a decorator `@basic_lightningmodule` to help our users transform their PyTorch `nn.Module` (required to have a normal training loop) into a `LightningModule` that can be accelerated by `bigdl-nano`. We may also add other decorators for GANs and other training loops in the future.

API Design
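A minimal usage sketch of the proposed API, assuming the signature discussed in this thread (`loss_creator`, `optim_creator`, plus extra keyword arguments forwarded as `**optim_configs`); the import path below is hypothetical and the details in #3181 may differ:

```python
import torch
from torch import nn

# Hypothetical import path; the actual location inside bigdl-nano may differ.
from bigdl.nano.pytorch import basic_lightningmodule

@basic_lightningmodule(loss_creator=nn.MSELoss,
                       optim_creator=torch.optim.Adam,
                       lr=1e-3)  # extra kwargs forwarded as **optim_configs
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 1)

    def forward(self, x):
        return self.fc(x)

# Net is now a LightningModule with a standard training loop,
# ready to be accelerated by bigdl-nano.
```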
Additionally, if you need to use onnxruntime support, do it consistently! (#3174)
Possible implementation
We need `**optim_configs` because we need to set important hyperparameters such as `lr` (although the default is usually good enough), and an optimizer cannot be built before the model is built.
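A minimal sketch of one way such a decorator could be implemented (assumptions: a plain supervised training loop and creator-style callables; this is not the actual implementation in #3181). It also shows why the optimizer can only be created after the model:

```python
import pytorch_lightning as pl

def basic_lightningmodule(loss_creator, optim_creator, **optim_configs):
    """Wrap an nn.Module subclass into a LightningModule (sketch only)."""
    def wrapper(module_cls):
        class Wrapped(pl.LightningModule):
            def __init__(self, *args, **kwargs):
                super().__init__()
                # Build the model first: the optimizer needs its parameters.
                self.model = module_cls(*args, **kwargs)
                self.loss = loss_creator()

            def forward(self, x):
                return self.model(x)

            def training_step(self, batch, batch_idx):
                x, y = batch
                return self.loss(self(x), y)

            def configure_optimizers(self):
                # **optim_configs carries hyperparameters such as lr.
                return optim_creator(self.model.parameters(), **optim_configs)

        return Wrapped
    return wrapper
```

Because the optimizer is only instantiated inside `configure_optimizers`, `**optim_configs` can be captured at decoration time while the optimizer itself is built after the model's parameters exist.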