lemyx opened this issue 10 months ago
Llama-7b finetuned with QLoRA on Alpaca shows different results in Table 4 and Table 5: the former reports 39.0, while the latter reports 38.8.