I trained a SetFitModel with a custom PyTorch head that is more complex than the built-in one and gives far superior results. I followed the instructions for building it from here: https://huggingface.co/docs/setfit/en/how_to/classification_heads
The problem is that the model's save_pretrained method does not save the script containing the head's class definition, so loading the model in the inference environment throws an error that the module for the head is missing. (See the recommended way to save a PyTorch model: https://stackoverflow.com/a/43819235)
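For context, this is roughly the pattern that answer recommends, and it is exactly what forces me to copy the class definition between repositories: only the weights get serialized, so the class (CustomHead below is just a simplified placeholder for my actual head) has to be importable wherever the weights are loaded:

```python
import torch
from torch import nn

# Simplified placeholder for my actual head; the real one is more complex.
class CustomHead(nn.Module):
    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        self.classifier = nn.Linear(in_features, num_classes)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        return self.classifier(embeddings)

# Training environment: only the weights are serialized, not the class.
head = CustomHead(in_features=768, num_classes=3)
torch.save(head.state_dict(), "custom_head.pt")

# Inference environment: the class definition must be importable here,
# otherwise there is nothing to load the state dict into.
head = CustomHead(in_features=768, num_classes=3)
head.load_state_dict(torch.load("custom_head.pt"))
head.eval()
```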
Is there any way to avoid this? Copying these scripts between repositories is not a good option. Does anyone know whether I would face the same problem if I converted the model to ONNX first and used the ONNX model for inference?
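To clarify what I mean by the ONNX route, this is the kind of setup I have in mind. Since the exported graph encodes the architecture itself, my understanding is that the Python class definition should not be needed at inference time (the input name, shapes, and the nn.Linear stand-in for my trained head are just placeholders):

```python
import numpy as np
import torch
from torch import nn
import onnxruntime as ort

# Stand-in for my trained head; in practice this would be the real head with its weights.
head = nn.Linear(768, 3)
head.eval()

# Export to ONNX: the graph carries the architecture, so (as far as I understand)
# the Python class definition is not required to run it later.
dummy_embeddings = torch.randn(1, 768)
torch.onnx.export(
    head,
    dummy_embeddings,
    "custom_head.onnx",
    input_names=["embeddings"],
    output_names=["logits"],
    dynamic_axes={"embeddings": {0: "batch"}, "logits": {0: "batch"}},
)

# Inference environment: only onnxruntime and the .onnx file are needed.
session = ort.InferenceSession("custom_head.onnx")
outputs = session.run(None, {"embeddings": np.random.randn(2, 768).astype(np.float32)})
print(outputs[0].shape)  # (2, 3)
```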
Thanks!