TaufeqRazakh opened this issue 3 months ago
Hey, sorry to barge in here. I had a high-level question regarding how the PT2 pipeline looks for you guys. Are you planning to use torch.export? If yes, are you going to need backprop support during inference?
CC'ed: @KenichiNomura
I am not sure whether any of the modules in Allegro or downstream of e3nn require backprop during inference. Prof. Ken-ichi might know this better. But I believe we will use the newer torch.export() on the PT2-trained model.
-Taufeq
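For reference, a minimal sketch of what exporting a trained energy model with torch.export might look like; the EnergyModel module and inputs here are hypothetical placeholders, not the actual Allegro model:

```python
import torch
from torch.export import export

class EnergyModel(torch.nn.Module):
    # hypothetical stand-in for the PT2-trained model
    def forward(self, pos):
        return (pos ** 2).sum()

model = EnergyModel().eval()
example_pos = torch.randn(10, 3)

# export traces the model into an ExportedProgram for deployment
exported = export(model, (example_pos,))
print(exported)
```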
To be specific, from what I understand Allegro computes forces by differentiating the energies with respect to the atomic positions, unless that assumption has been relaxed.
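For context, a minimal sketch of that pattern, i.e. forces obtained by differentiating a scalar energy with respect to positions, which is why autograd is still needed at inference time (energy_model here is a hypothetical placeholder):

```python
import torch

def compute_forces(energy_model, pos):
    # positions must track gradients so that dE/dr can be taken
    pos = pos.detach().requires_grad_(True)
    energy = energy_model(pos).sum()            # total potential energy
    (grad,) = torch.autograd.grad(energy, pos)  # dE/dr
    return -grad                                # F = -dE/dr
```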
Update: As I was playing with the code, I realized that we can actually get the old API to compile without graph breaks by setting:
import e3nn
e3nn.set_optimization_defaults(jit_script_fx=False)
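As a sanity check, here is a sketch (assuming a recent e3nn that provides set_optimization_defaults and PyTorch 2.x) where fullgraph=True makes torch.compile raise on any remaining graph break instead of silently falling back to eager:

```python
import torch
import e3nn
from e3nn import o3

# disable e3nn's TorchScript/fx optimization so Dynamo can trace through
e3nn.set_optimization_defaults(jit_script_fx=False)

irreps = o3.Irreps("4x0e + 4x1o")
tp = o3.FullyConnectedTensorProduct(irreps, irreps, irreps)

# fullgraph=True errors out if Dynamo has to split the graph
compiled_tp = torch.compile(tp, fullgraph=True)
out = compiled_tp(irreps.randn(16, -1), irreps.randn(16, -1))
```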
Allegro/NequIP do have graph breaks, but we should be good for tiny-nequip. We could even attempt a tiny-allegro by porting this.
tiny-allegro sounds great! Directly relevant to Taufeq's SC paper.
@mitkotak @KenichiNomura Yeah, porting tiny-allegro and ensuring that the graph-mode output matches eager mode sounds right to me!
I have a reproducer repository that demonstrates the PT2 graph breaks in the norm function in e3nn; a minimal sketch of the failing pattern is below.
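This sketch shows the kind of pattern such a reproducer exercises (assumptions on my part: the break is in e3nn.o3.Norm and torch._dynamo.explain is available in the installed PyTorch 2.x; this is not the actual repository contents):

```python
import torch
import torch._dynamo
from e3nn import o3

irreps = o3.Irreps("2x0e + 3x1o")
norm = o3.Norm(irreps)
x = irreps.randn(8, -1)

# explain() reports graph count, graph breaks, and their reasons
explanation = torch._dynamo.explain(norm)(x)
print(explanation)
```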
I am also attaching the logs, which contain all the outputs: norm_repro_test.txt
Requirements for this repro are