Open lovodkin93 opened 1 year ago
Hey, are there any plans to add support for mixed precision training? I did see in #12 that a temporary solution was suggested, but it still throws multiple exceptions related to mathematical operations between fp16 and fp32 values. Thanks! @rajcscw
Hey, we are working on support for Hugging Face's Accelerate. With that, mixed precision training would be possible.
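For reference, a minimal sketch (an assumption, not the library's actual integration) of how Hugging Face Accelerate typically enables fp16 mixed precision in a plain PyTorch training loop; `model`, `optimizer`, and `dataloader` are placeholder names:

```python
# Minimal sketch (assumption): fp16 mixed precision via Hugging Face Accelerate
# in a generic PyTorch training loop. The model/optimizer/dataloader objects
# are placeholders, not code from this repository.
from accelerate import Accelerator

def train(model, optimizer, dataloader, num_epochs=1):
    # Accelerate handles autocast and gradient scaling internally
    # when mixed_precision="fp16" is requested.
    accelerator = Accelerator(mixed_precision="fp16")
    model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

    model.train()
    for _ in range(num_epochs):
        for batch in dataloader:
            outputs = model(**batch)
            loss = outputs.loss
            # accelerator.backward applies loss scaling consistently,
            # which avoids manual fp16/fp32 mixing errors.
            accelerator.backward(loss)
            optimizer.step()
            optimizer.zero_grad()
```

Letting Accelerate own the autocast/scaling logic is what removes the need for the manual fp16 casts that were causing the mixed-dtype exceptions mentioned above.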
May I ask if there is complete code with the changes that I can learn from?