Open PhilKes opened 5 months ago
Same question! Looking forward to a model based on Llama 3!
It would be great to have it based on llama 3 😄
Same question.
Apparently Llama 3 has already been trained on a lot more code than Llama 2.
So, do we need a full blown Codellama 3 model, or do you think a FIM fine-tune of Llama 3 would be sufficient?
Would love to see a FIM fine-tune of Llama 3; I don't have any insights on how the training process differed from Llama 2. Is anyone working on a FIM fine-tune? Haven't seen one yet on HF.
Supposedly StarCoder2 should be perfect for FIM 😅
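For context on what a FIM (fill-in-the-middle) fine-tune enables: the model is prompted with the code before and after a gap and generates the missing middle. A minimal sketch of prompt assembly, following the infilling token format described for Code Llama (`<PRE>`, `<SUF>`, `<MID>`; the helper name here is just for illustration):

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt in the Code Llama
    infilling format; the model generates the missing middle
    after the <MID> token."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

prompt = build_fim_prompt(
    "def fib(n):\n    if n < 2:\n        return n\n    ",
    "\n    return result",
)
```

A base Llama 3 would need a fine-tune on this kind of objective (or similar special tokens) before editor plugins could use it for infilling, which is why a plain instruct model isn't a drop-in replacement here.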
Would also love to see a Code Llama version based on Llama 3. Llama 3 70B seems to perform better on coding tasks than Code Llama 70B.
3.1:70b will be :fire:
+1
+1, there are even fresher models out now that impress.
Is there going to be an updated version of Code Llama based on Meta's new Llama 3?