-
Hello!
When I fine-tune CodeBERT, GraphCodeBERT, and UniXcoder on the downstream tasks, they all fail with the same error, which is as follows:`==================== LOADING ====================
Loaded conf…
-
Hello,
Thanks for this awesome work! I am trying to extract the attribution score for a source-code-related task. Basically, we use CodeBERT from Hugging Face to get the code snippet embedding and …
-
Dear author (Self-Supervised Query Reformulation for Code Search), after reading your excellent paper, I first want to say how much I admire your work, but I still have some questions:
![7E745B17@073FFE7D F8578366](https://github.com/RedSmallPanda/SSQR/assets/45999537/b07b6974-1ab4-4…
-
Thanks for sharing such great work on Code AI.
In the CodeBERT paper, section B.1:
"We train CodeBERT on one NVIDIA DGX-2 machine using FP16. It combines 16 interconnected
NVIDIA Tesla V100 with 3…
-
512 or 1024?
Does CodeBERT have the same maximum input length as UniXcoder?
Thanks
-
Hello author, I would like to ask what fine-tuning you did on CodeBERT.
-
Following the README at CodeBERT/codesearch, I fine-tuned the model.
However, its task type is 'classification', so when I try to search for something with the model, I get a classification result, like this:
[{'…
-
Table 5 in the paper shows:
![image](https://user-images.githubusercontent.com/41561936/131811060-424e20c2-f887-4adb-8c17-de84c61c06a8.png)
Following the steps in the `CoCLR on Code Search` se…
-
Hi,
I am on an M1 Mac Pro, and I am trying to run the "Fine-Tune" step by executing the run.py script mentioned in https://github.com/microsoft/CodeBERT/tree/master/GraphCodeBERT/codesearch .
As p…
-
Dear Madam or Sir,
Could you please provide us with a list of the projects that have been used for the training (including the evaluation) of CodeBERT? Particularly the CodeBERT-MLM task?
From t…