Closed sebastianschramm closed 7 months ago
> I assume you did a full training and not LoRA?
Yes, full training.
> Are you planning to share the settings/configs that you used to run the SFT and DPO to get TinyLlama/TinyLlama-1.1B-Chat-v0.6?
There is nothing to share because I used exactly the same settings as Zephyr. In other words, I took their repo, changed the base model name, and hit enter.
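For anyone looking to reproduce this, a minimal sketch of what "change the base model name" looks like, assuming the Zephyr SFT recipe from Hugging Face's alignment-handbook repo (the file path, field names, and the TinyLlama checkpoint placeholder below are assumptions based on that repo and may differ between versions):

```yaml
# Hypothetical sketch of recipes/zephyr-7b-beta/sft/config_full.yaml
# from huggingface/alignment-handbook, with only the base model swapped.
# Field names reflect that repo at the time and may have changed.

# Model arguments
model_name_or_path: TinyLlama/<base-checkpoint>   # originally mistralai/Mistral-7B-v0.1
torch_dtype: bfloat16
```

The DPO stage would be the analogous one-line change in the DPO recipe config, with `model_name_or_path` pointing at the output of the SFT run instead of the base checkpoint.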
Amazing work! I really like the project!
If I understand correctly, TinyLlama/TinyLlama-1.1B-Chat-v0.6 was fine-tuned following the Zephyr recipes from HF's H4 team. I assume you did a full training and not LoRA? Are you planning to share the settings/configs that you used to run the SFT and DPO to get TinyLlama/TinyLlama-1.1B-Chat-v0.6?