DeepGraphLearning / ULTRA

A foundation model for knowledge graph reasoning
MIT License

Adding batch per epoch option to the finetune script #10

Closed daniel4x closed 9 months ago

daniel4x commented 9 months ago

Allow specifying `batch_per_epoch` in the finetune script (probably missed it?)

migalkin commented 9 months ago

Thanks, that's a good catch. I usually run fine-tuning with the run_many script to schedule several repeats with different seeds and get mean/std, but it should work in a single run as well.

daniel4x commented 9 months ago

It was a copy-paste mistake 😄. I also missed the actual slicing in the for loop over the training data, so the effective `batch_per_epoch` is still `len(train_loader)`: the variable is never used (it only appeared in the print).

            # run.py training loop
            from itertools import islice

            for batch in islice(train_loader, batch_per_epoch):
                ...
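For context, `itertools.islice` is what caps the number of batches consumed per epoch; without it, the loop walks the whole loader regardless of `batch_per_epoch`. A minimal sketch, using a plain list as a stand-in for `train_loader`:

```python
from itertools import islice

# Hypothetical stand-in for train_loader: ten "batches".
train_loader = [f"batch_{i}" for i in range(10)]
batch_per_epoch = 3

# islice stops after batch_per_epoch items; passing None instead
# would yield the full loader, matching the unsliced behavior.
seen = [batch for batch in islice(train_loader, batch_per_epoch)]
print(seen)  # ['batch_0', 'batch_1', 'batch_2']
```

Note that `islice` only truncates iteration; on the next epoch a fresh iterator over the loader starts from the beginning again.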

I'll create another patch for that, wdyt? @migalkin

migalkin commented 9 months ago

Sure, go ahead