I have a few questions regarding your optimization procedures.
If I understand correctly, I could use any `torch.optim` optimizer as the `opt` argument, because they all possess a `step` method, correct? As I am not so familiar with JAX, is the same true for the JAX optimizers? Maybe you could provide a short example of this in the docs (or at least I couldn't find one)?
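To make concrete what I mean, here is a minimal sketch using only stock `torch` and JAX APIs (nothing FunFact-specific, so I have left out how `opt` would actually be passed in): every `torch.optim` optimizer shares the in-place `step()`/`zero_grad()` interface, whereas the optimizers bundled with JAX are functional (an `init`/`update`/`get_params` triple), which is why I am unsure whether they plug in the same way.

```python
import torch

# Every torch.optim optimizer exposes the same step()/zero_grad() interface.
x = torch.randn(5, requires_grad=True)
opt = torch.optim.Adam([x], lr=1e-2)   # could equally be SGD, RMSprop, ...
for _ in range(3):
    loss = (x ** 2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()                          # in-place parameter update

# JAX's bundled optimizers are functional: no step() method, state is explicit.
import jax.numpy as jnp
from jax import grad
from jax.example_libraries.optimizers import adam

init_fun, update_fun, get_params = adam(step_size=1e-2)
state = init_fun(jnp.ones(5))
for i in range(3):
    g = grad(lambda p: jnp.sum(p ** 2))(get_params(state))
    state = update_fun(i, g, state)     # returns a new optimizer state
```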
Regarding the point above, I was wondering why you reimplemented `Adam` and `RMSProp` in https://github.com/yhtang/FunFact/blob/4e5694f7c9881223fcb41fcb21e49007586aa779/funfact/optim.py. They are already included in PyTorch, so reimplementing them seems a bit counterintuitive to me. Even though these algorithms are fairly simple, the reimplementation could be an unnecessary source of errors. Is there a reason not to use, by default, `Adam` from `torch` or one of the JAX optimizers, depending on the active backend?
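Just to illustrate what I am imagining (a purely hypothetical sketch on my part; `make_default_optimizer` and the string-based backend switch are not part of FunFact), the default could simply dispatch to the stock implementations:

```python
import torch

def make_default_optimizer(backend: str):
    """Hypothetical helper: return a stock Adam for the active backend."""
    if backend == 'torch':
        # Battle-tested implementation shipped with PyTorch.
        return torch.optim.Adam
    if backend == 'jax':
        # Functional Adam bundled with JAX.
        from jax.example_libraries.optimizers import adam
        return adam
    raise ValueError(f'unknown backend: {backend!r}')
```

Of course the two interfaces differ (in-place `step` vs. functional update), so some adapter code would still be needed, but the algorithmic core would then come from the upstream libraries.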
This is related to the review of FunFact for JOSS (see https://github.com/openjournals/joss-reviews/issues/4502)
Thank you in advance for your help!