salesforce / CodeT5

Home of CodeT5: Open Code LLMs for Code Understanding and Generation
https://arxiv.org/abs/2305.07922

LoRA fine-tuned CodeT5+ generating random final encoder outputs at inference time #175

Open monilouise opened 3 months ago

monilouise commented 3 months ago

Hi,

I'm developing a bug detection system, but I realized the evaluation metrics vary slightly between runs. I noticed that the following line returns different values for the same input when running a saved model at inference time:

outputs = self.encoder.encoder(input_ids=input_ids, attention_mask=input_mask)

Here, self.encoder.encoder is an instance of T5Stack.
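
For reference, this is roughly the check where I see the problem. It's only a sketch: the checkpoint name and the LoRA adapter path are placeholders, and my real code wraps the model in a classification head, but the encoder call is the same as above.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration
from peft import PeftModel

# Placeholders: in practice I load my own fine-tuned checkpoint and adapter.
base = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5p-220m")
model = PeftModel.from_pretrained(base, "./lora-bug-detector-adapter")
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5p-220m")

batch = tokenizer("def add(a, b): return a - b", return_tensors="pt")

with torch.no_grad():
    # Run only the T5Stack encoder twice on the exact same input,
    # mirroring the self.encoder.encoder(...) call in my model.
    out1 = model.get_encoder()(input_ids=batch.input_ids,
                               attention_mask=batch.attention_mask).last_hidden_state
    out2 = model.get_encoder()(input_ids=batch.input_ids,
                               attention_mask=batch.attention_mask).last_hidden_state

# I expected this to always be True, but in my pipeline the outputs differ.
print(torch.allclose(out1, out2))
```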

Any clues about what could be causing this?

Thanks in advance.