randaller / llama-chat

Chat with Meta's LLaMA models at home made easy
GNU General Public License v3.0

Train model using GPU #35

Open thefaizan opened 10 months ago

thefaizan commented 10 months ago

I have checked hf-training-example.py; by default it trains the model on the CPU. I have two GPUs, but if I enable the GPU in that script I get a CUDA out of memory error. How can I limit GPU memory usage, just like the inference example you provided for CUDA?
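One general way to cap per-GPU memory when loading a model with Hugging Face Transformers is the `max_memory` argument of `from_pretrained` together with `device_map="auto"`. This is a standard Transformers/Accelerate feature, not something taken from hf-training-example.py, and the limits below are illustrative assumptions. A minimal sketch:

```python
# Sketch: build a max_memory map that caps each GPU's usage, for use with
# device_map="auto" in Hugging Face Transformers. The "10GiB"/"30GiB" caps
# are illustrative assumptions, not values from hf-training-example.py.

def build_max_memory(n_gpus: int, per_gpu: str = "10GiB", cpu: str = "30GiB") -> dict:
    """Return a max_memory dict: one cap per GPU index plus a CPU cap."""
    limits = {i: per_gpu for i in range(n_gpus)}
    limits["cpu"] = cpu
    return limits

# Example usage with Transformers (requires torch + transformers installed):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "path-to-llama-weights",           # hypothetical path
#     device_map="auto",
#     max_memory=build_max_memory(n_gpus=2),
# )

print(build_max_memory(2))  # → {0: '10GiB', 1: '10GiB', 'cpu': '30GiB'}
```

With caps like these, Accelerate spreads the layers across both GPUs (spilling to CPU if needed) instead of trying to fit everything on one device, which avoids the out-of-memory error at load time.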