Hi, thanks for your work.
We trained your model on our own dataset, which combines the LaPa dataset with new images that include people wearing masks.
After training we got checkpoints like these in the blob/states/farl/face_parsing.train_lapa_farl-b-ep64_448_refinebb directory:
2.5G 10902_2.pth
2.5G 21804_4.pth
2.5G 32706_6.pth
2.5G 43608_8.pth
2.5G 54510_10.pth
2.5G 65412_12.pth
2.5G 76314_14.pth
2.5G 87216_16.pth
4.0K _records.pth
Each of these contains the model weights, optimizer state, etc.
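For reference, this is roughly how I'm extracting just the network weights from one checkpoint; the "networks" key is only my guess, so I print the keys first to check what the .pth actually holds:

```python
import torch

# Minimal sketch: load one training checkpoint on CPU and keep only the network
# weights, dropping optimizer state to shrink the 2.5G file.
# NOTE: the key name "networks" is an assumption, not confirmed against these files.
ckpt = torch.load("87216_16.pth", map_location="cpu")
print(list(ckpt.keys()))  # inspect what the checkpoint really contains

state_dict = ckpt.get("networks", ckpt)  # fall back to the raw dict if the key differs
torch.save(state_dict, "face_parsing_weights_only.pth")
```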
The problem is: how can we use these weights in the facer project?
I checked facer's loading process and it uses a TorchScript model.
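If I understand correctly, I would need to rebuild the same network in Python, load the extracted weights, and export it with torch.jit.trace so facer can load it. Below is only a sketch of that idea: FaceParsingNet is a stand-in for the real FaRL parsing model (I don't know the exact class to instantiate), and the output filename is made up.

```python
import torch
import torch.nn as nn

# Stand-in for the real FaRL face parsing network used during training.
class FaceParsingNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(3, 11, kernel_size=3, padding=1)  # dummy head, 11 LaPa classes

    def forward(self, x):
        return self.net(x)

model = FaceParsingNet()
# model.load_state_dict(state_dict)  # the weights-only dict saved earlier
model.eval()

example = torch.randn(1, 3, 448, 448)  # 448x448 input, matching the config name
scripted = torch.jit.trace(model, example)
scripted.save("face_parsing_custom.pt")  # hypothetical file I would then point facer at
```

Is this the intended way to go from these training checkpoints to a TorchScript model that facer accepts, or is there an existing export script I'm missing?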