Closed — dialuser closed this issue 3 weeks ago
Hello, this happens because we wrap the pre-trained Timer in a layer of code called TimerBackbone.py. The wrapper holds the Timer instance as the member variable self.backbone, so using the wrapper is equivalent to using Timer directly. This layer of encapsulation therefore does not affect the actual functionality.
Hi,
I fine-tuned the model by setting args.is_finetuning to 1. Then I tried to run testing only, loading the saved fine-tuned model from checkpoints/[modelstring]/checkpoint.pth. However, the current code in Timer's __init__() didn't work: it tried to load only the model.backbone weights from the .pth file.
Why doesn't it load the weights for the entire Timer model? Did I do this correctly?
Thanks