And if instantiating it is too complicated, how can I use the inference.py file to get my predictions into a file? For example, I give the model a query string and it outputs the predicted code.
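For reference, a minimal sketch of that kind of query-to-prediction loop is shown below. It is only an illustration, not the repo's actual inference.py: the `embed`/`search` helpers and the `predictions.txt` file name are made up for the example, the plain `microsoft/codebert-base` encoder stands in for the model, and in practice you would substitute your fine-tuned weights.

```python
import torch
from transformers import RobertaTokenizer, RobertaModel

# Illustration only: replace `encoder` with your fine-tuned model in practice.
tokenizer = RobertaTokenizer.from_pretrained("microsoft/codebert-base")
encoder = RobertaModel.from_pretrained("microsoft/codebert-base")
encoder.eval()

def embed(text):
    # Encode a string and use the first ([CLS]) hidden state as its embedding.
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        hidden = encoder(**inputs)[0][:, 0, :]
    return torch.nn.functional.normalize(hidden, dim=-1)

def search(query, candidates):
    # Rank candidate code snippets by cosine similarity to the query.
    q = embed(query)
    scored = [(code, float(q @ embed(code).T)) for code in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)

# Write the ranked predictions for one query to a file.
results = search("read a json file", ["json.load(open(path))", "print('hello')"])
with open("predictions.txt", "w") as f:
    for code, score in results:
        f.write(f"{score:.4f}\t{code}\n")
```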
As discussed over e-mail, you can refer to this issue.
Thank you very much for the fast answer.
Hello, I managed to fine-tune CodeBERT for the code search task, and I was wondering how I could use the model.bin I just created to perform code search on my own natural language queries.
So model.bin is a state dictionary containing all the weights, but I don't get how I am supposed to go from that to a working model.
I tried to instantiate it with
model = RobertaModel.from_pretrained("microsoft/codebert-base")
but it returns the following error
etc., etc.
I also noticed that there is a Model class in model.py, so it is probably the one I must use, but I don't understand how I am supposed to instantiate it in order for the model to work. Can you show me an example or explain it a little bit?
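For anyone landing here with the same question, a rough sketch of going from the state dict to a working model is below. The `Model` wrapper is a simplified stand-in for the class in model.py (its real constructor and forward pass may differ), the `model.bin` path and `strict=False` flag are assumptions; the key point is that a raw state dict is loaded with torch.load plus load_state_dict, not with from_pretrained.

```python
import torch
import torch.nn as nn
from transformers import RobertaConfig, RobertaModel

class Model(nn.Module):
    # Simplified stand-in for the Model class in model.py.
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder

    def forward(self, input_ids, attention_mask=None):
        # Return the first ([CLS]) hidden state as the sequence representation.
        outputs = self.encoder(input_ids, attention_mask=attention_mask)
        return outputs[0][:, 0, :]

# Rebuild the wrapper around the pretrained encoder, as during fine-tuning.
config = RobertaConfig.from_pretrained("microsoft/codebert-base")
encoder = RobertaModel.from_pretrained("microsoft/codebert-base", config=config)
model = Model(encoder)

# model.bin is a state dict, so load it into the wrapper instead of from_pretrained.
state_dict = torch.load("model.bin", map_location="cpu")
model.load_state_dict(state_dict, strict=False)  # strict=False tolerates key-name mismatches
model.eval()
```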
Thank you very much