jina-ai / jerboa
LLM finetuning
Apache License 2.0
We need to train alpaca-lora with the same number of LoRA layers as the Falcon-7B run, so we can compare the two fairly and isolate the effect of switching the base model from LLaMA to Falcon.
#71 · Closed · alaeddine-13 closed this 1 year ago
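A minimal sketch of the apples-to-apples check the issue asks for: count how many weight matrices actually receive a LoRA adapter under each setup. The layer counts and target-module names below are assumptions (32 decoder layers for both 7B models; alpaca-lora typically targets the separate `q_proj`/`v_proj` projections in LLaMA, while Falcon exposes a fused `query_key_value` projection), not values taken from this repo.

```python
# Hypothetical comparison of LoRA-adapted module counts for the two runs.
# Assumed: both LLaMA-7B and Falcon-7B use 32 decoder layers.
LLAMA_LAYERS = 32
FALCON_LAYERS = 32

# Assumed target modules: alpaca-lora's usual LLaMA targets vs Falcon's
# fused attention projection.
llama_targets = ["q_proj", "v_proj"]
falcon_targets = ["query_key_value"]

def adapted_modules(n_layers: int, targets: list[str]) -> int:
    """Number of weight matrices that receive a LoRA adapter pair (A, B)."""
    return n_layers * len(targets)

print(adapted_modules(LLAMA_LAYERS, llama_targets))    # 64 adapted matrices
print(adapted_modules(FALCON_LAYERS, falcon_targets))  # 32 adapted matrices
```

If the counts differ like this, the runs are not directly comparable: either add Falcon's MLP projections to its target list or restrict the LLaMA run, so both sides adapt the same number of layers before measuring the llama-to-falcon effect.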