Open AlexandrNP opened 8 months ago
According to the latest CANDLE-ization requirements, the checkpointing path is managed by the CANDLE library:
```python
ckpt = candle.CandleCkptPyTorch(vars(self.args))
ckpt.set_model({"model": self.model, "optimizer": opt})
...
ckpt.ckpt_epoch(epo, float(train_loss))
```
During cross-study runs, checkpointing defaults to the same directory for every model. As a result, training a different model does not start fresh: the procedure restarts from the checkpoint left by the previous model, which prevents training different models in sequence.
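A possible workaround is to namespace the checkpoint directory by study and model before handing the parameter dict to CANDLE. This is only a sketch: the key name `ckpt_directory` and the helper below are assumptions about how the path could be injected, not confirmed CANDLE API.

```python
import os

def unique_ckpt_dir(base_dir: str, study: str, model_name: str) -> str:
    """Return a checkpoint path namespaced by (study, model) so that
    cross-study runs never share a directory."""
    return os.path.join(base_dir, study, model_name)

# Hypothetical usage: override the checkpoint path per run before
# constructing the CANDLE checkpointer. "ckpt_directory" is an assumed
# parameter name, not verified against the CANDLE library.
params = vars(...) if False else {}  # placeholder for vars(self.args)
params["ckpt_directory"] = unique_ckpt_dir("./checkpoints", "CCLE", "graphdrp")
# ckpt = candle.CandleCkptPyTorch(params)  # would then use the unique path
```

With a scheme like this, each (study, model) pair resumes only from its own checkpoints, so a cross-study sweep no longer restarts every model from the same saved state.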