Closed: rperi closed this issue 9 months ago
The paper mentions the use of a 13B LLaMA backend, but the repository uses a 7B model. Which is the correct one? Am I missing something?

In our paper we used LLaMA 13B, but due to resource constraints and for easier deployment we retrained on the 7B LLaMA model and open-sourced this 7B version. If we have sufficient computing resources in the future, we will retrain a better 13B version and release it as open source.

Thanks for the clarification!