nam1410 opened 5 days ago
Thank you for your work on finetuning LLMs using LoRA, DoRA, etc. I'm wondering how I can get started finetuning my custom model with torchtune LoRA. Do you have any suggestions?

Here's my suggestion for the steps you should take:

1. Set up torchtune and run one of our LoRA finetune configs.
2. Modify the config so that `model._component_` points to a function that returns your custom model.
3. Modify the `checkpointer` section so that it can read/write your weights in the right format. You can use our torchtune checkpointer or write a custom one using this interface.

Let me know if you run into any problems.

We've also just added a small tutorial on adding custom components to torchtune - check it out!

https://pytorch.org/torchtune/main/basics/custom_components.html#launching-with-custom-components

Whether or not you find it useful, all feedback is welcome.
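To make the `model._component_` step concrete, here is a minimal sketch of what such a builder function could look like. The module and function names (`my_models`, `my_custom_model`) and the toy two-layer model are placeholders I've made up for illustration, not part of torchtune; the YAML fragment in the docstring is likewise an assumed shape based on torchtune's dotted-path config convention.

```python
# Hypothetical builder a torchtune config could point to via
# `model._component_`. All names here are placeholders, not torchtune APIs.
import torch
from torch import nn


def my_custom_model(vocab_size: int = 1000, dim: int = 64) -> nn.Module:
    """Return the custom model; the config system calls this with its kwargs.

    The corresponding config section would look roughly like:

        model:
          _component_: my_models.my_custom_model
          vocab_size: 1000
          dim: 64
    """
    # Toy stand-in for a real architecture: embed tokens, project to logits.
    return nn.Sequential(
        nn.Embedding(vocab_size, dim),
        nn.Linear(dim, vocab_size),
    )
```

torchtune would then instantiate the model from the config at launch time, so the function only needs to return a ready-to-train `nn.Module`.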