Closed — matthiasgeihs closed this issue 1 year ago
Yes - thanks for the question! I am planning on adapting the multi-GPU tutorial from Hugging Face that uses the FSDP method for multi-GPU training. I'm planning on doing SFT with mpt-30b, and perhaps falcon-40b, with this multi-GPU setup. I don't have a timeline on when this will be ready though - it could be a week or a month depending on my free time.
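For reference, FSDP multi-GPU fine-tuning with Hugging Face tooling is typically driven by an Accelerate config file. Below is a minimal sketch for a 2-GPU setup; the filename, the `train.py` script name, and the exact key values are illustrative assumptions (they may differ across Accelerate versions), not something confirmed in this thread:

```yaml
# fsdp_config.yaml — hypothetical example for 2 GPUs (e.g., 2x A100 40GB)
compute_environment: LOCAL_MACHINE
distributed_type: FSDP
mixed_precision: bf16
num_processes: 2          # one process per GPU
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP  # wrap each transformer block
  fsdp_sharding_strategy: FULL_SHARD             # shard params, grads, optimizer state
  fsdp_state_dict_type: FULL_STATE_DICT          # gather full weights when saving
```

A training script would then be launched with something like `accelerate launch --config_file fsdp_config.yaml train.py`.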
^ you could try to edit the above yourself - if you do, lmk!
Thx for the response. I'll let you know if I try and how things are going.
Hey, I found this repository via your Hugging Face model dfurman/mpt-7b-instruct-openorca. This looks very useful! I am currently mostly working with 2x A100 40GB. Are there any plans to enhance the scripts to make fine-tuning work on multiple GPUs?