-
Dear Madam or Sir,
Could you please provide us with a list of the projects that were used to train (and evaluate) CodeBERT, particularly for the CodeBERT-MLM task?
From t…
-
Hello! word2vec converts each word into a distinct vector representation, so I think of word2vec as a model trained at the word level, whereas CodeBERT can represent a whole code snippet as a vector. May I regard CodeBERT and GraphCodeBERT as models trained at the sequence level?
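For context, here is a minimal sketch of what I mean by token-level versus sequence-level vectors. It assumes the Hugging Face `transformers` API and CLS-style pooling; both are my own assumptions, not something prescribed by the CodeBERT repository.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

code = "def add(a, b): return a + b"
inputs = tokenizer(code, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One vector per token (the word-level analogue of word2vec).
token_vectors = outputs.last_hidden_state
# One pooled vector for the whole snippet (a sequence-level representation).
sequence_vector = outputs.last_hidden_state[:, 0, :]
print(token_vectors.shape, sequence_vector.shape)
```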
-
Hello! The README says that pretrained models such as BERT are kept in a separate repository. Could you share that repository's address?
-
Thanks for your great work.
I have tried to train the comment classifier based on your instructions, i.e., by fine-tuning CodeBERT on the dataset. However,
the results on the test set are:
![im…
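For reference, this is roughly the kind of setup I used. It is only a minimal sketch of fine-tuning CodeBERT as a comment classifier, not the repository's official script; the label count and example input are my own assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/codebert-base",
    num_labels=2,  # assumed binary comment classification; adjust to your label set
)

batch = tokenizer(["// TODO: handle the error case"],
                  truncation=True, padding=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits
print(logits.softmax(dim=-1))  # class probabilities before any fine-tuning
```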
-
Thanks for your great work! I have a few questions:
1. Are the models in the Implementations part reimplemented by yourselves, or are they the collected official open-source implementations?
2. The Deepcs link f…
-
Hello,
I have noticed that loading PyTorch `bin` or `pt` pickle archives now seems to be possible with the library, which is a great addition. I have just tried this new feature and I a…
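For comparison, here is a plain PyTorch sketch of what un-pickling such an archive looks like; this is not the library's own API, just `torch.load` on a local `pytorch_model.bin`, and the file path is a placeholder.

```python
import torch

# torch.load un-pickles the archive into an ordinary state dict of tensors.
state_dict = torch.load("pytorch_model.bin", map_location="cpu")
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))
```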
-
Hi,
I downloaded the model from https://huggingface.co/microsoft/codebert-base/tree/main and am using it to run inference (without fine-tuning), but I am unable to load the model file pytorch_model.bin as th…
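In case it helps frame the question, this is the usual way to load that checkpoint through `transformers` (my assumption that `transformers` is being used), rather than opening pytorch_model.bin directly:

```python
from transformers import AutoTokenizer, AutoModel

# from_pretrained resolves the checkpoint and its config together,
# so the raw pytorch_model.bin never needs to be un-pickled by hand.
tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")
model.eval()
```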
-
I'm trying to fine-tune the concode task using 'code' as both input and output, instead of 'nl' and 'code'. I wanted to know if we can directly use the fine-tuned checkpoints of the concode task and som…
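To illustrate what I mean, here is a hedged sketch of rewriting the data so that 'code' appears on both sides. It assumes the JSON-lines format with "nl" and "code" fields used by the concode data; the file names are placeholders.

```python
import json

# Copy the "code" field into "nl" so the same pipeline reads code as input
# and code as target (assumed field names and paths, for illustration only).
with open("train_original.json") as src, open("train_code2code.json", "w") as dst:
    for line in src:
        example = json.loads(line)
        example["nl"] = example["code"]
        dst.write(json.dumps(example) + "\n")
```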
-
1. > The high variability would suggest a content-dependent head, while low variability would indicate a content-independent head.
> Figure 7: Visualization of attention heads in CodeBERT, along w…
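For concreteness, here is a minimal sketch (not the paper's code) of pulling per-head attention maps out of CodeBERT for this kind of visualization; the layer/head indices, the example input, and the variance proxy for content dependence are my own assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base", output_attentions=True)

inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")
with torch.no_grad():
    attentions = model(**inputs).attentions  # tuple: one tensor per layer

layer, head = 0, 0
head_map = attentions[layer][0, head]  # (seq_len, seq_len) attention weights
# Variance of each position's attention distribution is one rough proxy
# for how content-dependent a head is.
print(head_map.var(dim=-1))
```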