TreB1eN / InsightFace_Pytorch

PyTorch 0.4.1 codes for InsightFace

How to load a pre-trained model and continue training? #34

Closed · Royzon closed this issue 5 years ago

Royzon commented 5 years ago

How can I change your code to load a saved model and then continue training? I tried to modify 'Learner.py' as follows, but it didn't work.

    def train(self, conf, epochs):
        self.load_state(conf, '××××××××××××××.pth', True, True)

The error message is:

    AttributeError: 'PosixPath' object has no attribute 'seek'. You can only torch.load from a file that is seekable. Please pre-load the data into a buffer like io.BytesIO and try to load from it instead.

Can anybody here give me some good advice?
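The error message itself points at the fix: torch.load in this PyTorch version cannot read from a PosixPath directly. A minimal sketch of the two workarounds it implies, using a hypothetical checkpoint path:

    import io
    import torch
    from pathlib import Path

    ckpt = Path('work_space/save') / 'model_xxxx.pth'  # hypothetical path and name

    # Workaround 1: hand torch.load a plain string instead of a PosixPath
    state_dict = torch.load(str(ckpt))

    # Workaround 2: pre-load the file into a seekable buffer,
    # as the error message itself suggests
    with open(str(ckpt), 'rb') as f:
        buffer = io.BytesIO(f.read())
    state_dict = torch.load(buffer)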

ruiming46zrm commented 5 years ago

Three suggestions:

1. Is there a model_xxxx.pth file in the save folder, and are you passing the right fixed_str?
2. Do not load the model only; load the head and the optimizer as well.
3. If the model was produced by multi-GPU training, use the following method to load it (do not change the head and optimizer loading code):

    def load_state(self, conf, fixed_str, from_save_folder=False, model_only=False):
        if from_save_folder:
            save_path = conf.save_path
        else:
            save_path = conf.model_path

        # str() because torch.load in PyTorch 0.4.1 cannot read a PosixPath
        # directly (the 'seek' AttributeError quoted above)
        state_dict = torch.load(str(save_path / 'model_{}'.format(fixed_str)))

        # A model trained with nn.DataParallel saves its keys with a
        # 'module.' prefix; strip it so a single-GPU model can load them.
        from collections import OrderedDict
        new_state_dict = OrderedDict()
        for k, v in state_dict.items():
            name = k[7:]  # drop the leading 'module.' (7 characters)
            new_state_dict[name] = v
        self.model.load_state_dict(new_state_dict)

        if not model_only:
            self.head.load_state_dict(torch.load(str(save_path / 'head_{}'.format(fixed_str))))
            self.optimizer.load_state_dict(torch.load(str(save_path / 'optimizer_{}'.format(fixed_str))))
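To actually resume training (the original question), a hedged sketch of how train() in Learner.py might call this loader before the loop; 'xxxx.pth' stands in for a real fixed_str suffix:

    def train(self, conf, epochs):
        # Restore model, head, and optimizer before the loop runs, so
        # training genuinely continues instead of starting from scratch.
        # 'xxxx.pth' is a placeholder for the real checkpoint suffix.
        self.load_state(conf, 'xxxx.pth', from_save_folder=True, model_only=False)
        ...  # original training loop of Learner.py, unchanged
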
Royzon commented 5 years ago

Thanks, it is really helpful to me.

ANDRESHZ commented 1 year ago

Hi @Royzon, I hope you are getting nice results. I am wondering whether you finally found a way to fine-tune on your own dataset.

Can you share your code with the community?
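In case it helps: a rough, untested sketch of loading a checkpoint and continuing training with this repo's entry points, assuming get_config and face_learner as used in the repo's train.py; the checkpoint suffix is a placeholder:

    from config import get_config
    from Learner import face_learner

    conf = get_config()                 # point conf at your own dataset first
    learner = face_learner(conf)
    # Restore model, head, and optimizer ('xxxx.pth' is a placeholder suffix)
    learner.load_state(conf, 'xxxx.pth', from_save_folder=True, model_only=False)
    learner.train(conf, epochs=8)       # continue training on the new data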