Is it possible to continue the initial autoregressive pre-training on a custom dataset, as was done for Code Llama - Python? This would in principle allow for the fine-tuning of Code Llama models for other programming languages. If so, would you please provide an example training script? Any information or help would be much appreciated!
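For context on what such a script would need to do: continued autoregressive pre-training is ordinary causal-LM training on new text, so the main custom piece is the data pipeline — tokenize the corpus, concatenate the documents, and cut the stream into fixed-length blocks where the labels equal the inputs (the model shifts them internally to predict the next token). The sketch below shows only that packing step in plain Python so it runs without downloading a model; a real run would feed these blocks to the Code Llama checkpoint through a standard causal-LM training loop (for example, the Hugging Face `transformers` stack with `AutoModelForCausalLM` and `Trainer`). The `pack_into_blocks` helper is a hypothetical name introduced here for illustration, not part of any released Code Llama code.

```python
def pack_into_blocks(token_streams, block_size):
    """Concatenate per-document token-id lists and split the stream
    into fixed-length blocks, dropping the trailing remainder.

    This mirrors the usual data prep for causal-LM (autoregressive)
    pre-training: labels are a copy of input_ids because the model
    shifts them by one position internally.
    """
    # Flatten all documents into one long token stream.
    flat = [tok for doc in token_streams for tok in doc]
    n_blocks = len(flat) // block_size
    blocks = []
    for i in range(n_blocks):
        chunk = flat[i * block_size:(i + 1) * block_size]
        # labels == input_ids for next-token prediction.
        blocks.append({"input_ids": chunk, "labels": list(chunk)})
    return blocks


# Tiny demo with made-up token ids standing in for tokenizer output.
example = pack_into_blocks([[1, 2, 3], [4, 5, 6, 7]], block_size=2)
# → three blocks: [1, 2], [3, 4], [5, 6]; the trailing token 7 is dropped
```

In an actual continued pre-training run, the token streams would come from the Code Llama tokenizer applied to your corpus, and `block_size` would match the context length you train at.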