CHesketh76 opened 2 weeks ago
I have a 3070 Ti and was wondering whether running this training pipeline on consumer-grade hardware is possible. If not, what are the recommended hardware requirements, and what would training cost?

Hi, I'm sorry, but I don't think the RTX 3070 Ti has enough memory for training or even running the LongWriter model. We train on 8x H800 (80 GB) GPUs for full fine-tuning; LoRA and quantization may reduce memory utilization.
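For a rough sense of why full fine-tuning is out of reach on a consumer card, here is a back-of-the-envelope VRAM estimate. The ~16 bytes/parameter figure is a common rule of thumb for mixed-precision training with Adam (fp16 weights + gradients plus fp32 master weights and optimizer states); the ~9B parameter count is illustrative, based on the LongWriter-glm4-9b checkpoint:

```python
def full_finetune_gb(params_billion, bytes_per_param=16):
    """Rough training-memory estimate in GB, assuming mixed-precision Adam:
    ~2 bytes/param fp16 weights + 2 grads + ~12 fp32 master/optimizer states.
    Ignores activations, which only add to the total."""
    # params_billion * 1e9 params * bytes/param / 1e9 bytes-per-GB
    return params_billion * bytes_per_param

print(full_finetune_gb(9))  # -> 144 (GB), vs. 8 GB on an RTX 3070 Ti
```

Even ignoring activation memory, ~144 GB of state has to be sharded across multiple large GPUs (8x 80 GB H800 in our setup); an 8 GB card cannot hold even the fp16 weights (~18 GB) for inference, which is why LoRA plus 4-bit quantization is the usual route for consumer hardware.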