Hi @nizamsp, from the error output, I understand that you are using a model from the Transformers package. In this case, you need to save it with their method `.save_pretrained()` in your custom `Encoder` implementation and re-load it with `.from_pretrained()`. If your encoder is not trainable, i.e., if you are training only the `EncoderHead`, then you don't need to save it at all: you can simply `pass` in the `.save()` method and load the pretrained Transformers model as usual in the `.load()` method of your encoder.
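For reference, a minimal sketch of what that could look like, assuming a custom encoder that wraps a Transformers model (the class name and constructor are illustrative, not taken from your code, and `forward`, `embedding_size`, and the collate function are omitted for brevity):

```python
from quaterion_models.encoders import Encoder
from transformers import AutoModel, PreTrainedModel


class TransformerEncoder(Encoder):
    # Hypothetical encoder wrapping a Transformers model; forward(),
    # embedding_size, and the collate function are omitted for brevity.
    def __init__(self, model: PreTrainedModel):
        super().__init__()
        self.model = model

    @property
    def trainable(self) -> bool:
        return True  # the transformer itself is fine-tuned

    def save(self, output_path: str):
        # persist the fine-tuned weights with the Transformers helper
        self.model.save_pretrained(output_path)

    @classmethod
    def load(cls, input_path: str) -> "Encoder":
        # restore the fine-tuned weights from the same directory
        return cls(AutoModel.from_pretrained(input_path))
```

If the encoder is frozen instead, `trainable` returns `False`, `save()` can simply `pass`, and `load()` can call `AutoModel.from_pretrained()` with the original pretrained model name.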
Thanks for the quick reply. I saved the model using `save_servable` and am trying to serve it to get embeddings via `model = SimilarityModel.load(path)` and `model.encode` (a rough sketch of the serving side follows the training code below).
```python
import pytorch_lightning as pl
from quaterion import Quaterion
from quaterion.dataset import GroupSimilarityDataLoader

# `Model` and `dataset` are defined elsewhere in my training script
model = Model(num_groups=dataset.get_num_industries(), lr=3e-5)
train_dataloader = GroupSimilarityDataLoader(dataset, batch_size=64, shuffle=True)
trainer = pl.Trainer(accelerator="auto", devices=1, num_nodes=1, max_epochs=30)
Quaterion.fit(
    trainable_model=model,
    trainer=trainer,
    train_dataloader=train_dataloader,
)
model.save_servable("/data/betterapp.ai/output/cluster_trainer/quaterion/betterapp")
```
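On the serving side, this is roughly what I do (the `descriptions` list is just a placeholder for my actual inputs, and I am assuming `SimilarityModel` is imported from `quaterion_models`):

```python
from quaterion_models import SimilarityModel

# load the servable model saved above
model = SimilarityModel.load("/data/betterapp.ai/output/cluster_trainer/quaterion/betterapp")

# placeholder inputs in the same format the encoder's collate function expects
descriptions = ["A startup building ...", "Another company description ..."]
embeddings = model.encode(descriptions)
```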
I am trying to adapt the code example given here, https://github.com/qdrant/quaterion/blob/master/examples/train_startup_search.py, to get embeddings after fine-tuning.
I had to do the following to get the encoding to work. In that file, https://github.com/qdrant/quaterion/blob/master/examples/train_startup_search.py, the save and load are inconsistent:
```python
def save(self, output_path: str):
    self.encoder.save(os.path.join(output_path, self._pretrained_name))

@classmethod
def load(cls, input_path: str) -> "Encoder":
    return StartupEncoder(input_path)
```
I changed `self.encoder.save(os.path.join(output_path, self._pretrained_name))` to `self.encoder.save(output_path)`, so that `save` writes to the same path that `load` later receives. I hope this is the right fix.
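So the pair now looks like this, with both methods using the same directory:

```python
def save(self, output_path: str):
    # save directly into the directory that load() will later receive
    self.encoder.save(output_path)

@classmethod
def load(cls, input_path: str) -> "Encoder":
    return StartupEncoder(input_path)
```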
Oh, you're right. This example is from the early stages of Quaterion and is definitely not the best possible example out there. I'll fix it.
Completed in #194
I am getting the following error.
My files are as follows:
models.py
serving.py
encoders.py
Can you please help me with this? I have been stuck here for a while.