Open longyangqi opened 5 months ago
Great work! However, training on my own data consumes too much GPU memory. Since fp16 is already supported for memory-efficient inference, can the model also be trained with fp16?
Thanks!