lxe / simple-llm-finetuner — Simple UI for LLM Model Finetuning (MIT License, 2.05k stars, 132 forks)
Clarify that 16GB VRAM in itself is enough #21
Closed — vadi2 closed this 1 year ago

vadi2 commented 1 year ago:
Seems to be working OK on my RTX 4080 with exactly 16GB of VRAM.