I'm doing some experiments with dataset merging at different points of training. Just getting started on this — is it possible to save a checkpoint, then load it and continue finetuning on a different dataset as if the checkpoint were the initial model? I don't care about the optimizer state, warmup, etc., since it should be equivalent to starting a new run on an existing adapter. A bit confused by the documentation on this.
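To make the question concrete, here's a minimal torch-only sketch of what I mean — save just the model weights, then reload them into a second run with a brand-new optimizer so the schedule/warmup starts from scratch. `TinyAdapter` and the checkpoint path are made-up names for illustration; this deliberately ignores any PEFT/Trainer-specific resume machinery:

```python
import torch
import torch.nn as nn

# Stand-in for the adapter being finetuned (illustrative, not a real library class).
class TinyAdapter(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(4, 4)

    def forward(self, x):
        return self.proj(x)

# --- "Run 1": train briefly on dataset A, then save only the weights ---
model = TinyAdapter()
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss = model(torch.randn(8, 4)).pow(2).mean()
loss.backward()
opt.step()
torch.save(model.state_dict(), "adapter_ckpt.pt")  # weights only, no optimizer state

# --- "Run 2": load the checkpoint as if it were the initial model ---
model2 = TinyAdapter()
model2.load_state_dict(torch.load("adapter_ckpt.pt"))
# A fresh optimizer carries no momentum or step count, so warmup and the
# LR schedule restart exactly as in a brand-new run on dataset B.
opt2 = torch.optim.AdamW(model2.parameters(), lr=1e-3)
```

Is this the right mental model, or does the library's checkpoint loading also restore optimizer/scheduler state that I'd need to explicitly discard?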