Open jayantj opened 6 years ago
@jayantj can you suggest a better way to test this? I played with Tangent; it works quite well, but that is not enough.
Tangent should hopefully be a drop-in replacement for autograd. However, Tangent is new and might require some fixes to fully work here. Please let me know what issues you run into and I'd be happy to help.
@alexbw thanks, we will definitely let you know if we find any problems
The current implementation uses autograd to auto-differentiate the loss function. Because automatic differentiation is too slow, we don't use it for training; instead, we compute the derivatives ourselves and added an option to verify them against the autograd values every 10k iterations or so. This is a very useful feature for debugging.
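The same verification idea can be sketched without autograd, using a plain central finite-difference check in NumPy. The loss and hand-derived gradient below are hypothetical stand-ins, not gensim's actual ones:

```python
import numpy as np

def loss(w):
    # Hypothetical loss standing in for the model's real loss function.
    return np.sum(w ** 2) + np.sum(np.sin(w))

def manual_grad(w):
    # Hand-derived gradient of the loss above.
    return 2 * w + np.cos(w)

def check_gradient(w, eps=1e-6, tol=1e-4):
    """Compare the hand-computed gradient against central finite differences."""
    numeric = np.zeros_like(w)
    for i in range(w.size):
        step = np.zeros_like(w)
        step[i] = eps
        numeric[i] = (loss(w + step) - loss(w - step)) / (2 * eps)
    return np.max(np.abs(numeric - manual_grad(w))) < tol

w = np.random.default_rng(0).normal(size=5)
print(check_gradient(w))
```

In training code, a check like this would run only every N iterations (e.g. 10k), since the finite-difference loop costs one loss evaluation per parameter per direction and is far too slow to run every step.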
Another library to consider is Tangent (thanks to @alexbw for the suggestion!), which generates gradient code ahead of time. The main concern here is speed: to be worthwhile, its auto-differentiation should be comparable to or faster than our hand-computed derivatives.