AI-Hypercomputer / maxtext

A simple, performant and scalable Jax LLM!
Apache License 2.0

Support mixed-precision quantization configuration on AqtEinsum #1019

Closed · lenscloth closed this 2 weeks ago

lenscloth commented 2 weeks ago

self.quant_dg is provided as a dict when mixed-precision is enabled. However, the current AqtQuantization.einsum does not support quant_dg being a dict.

This commit enables mixed-precision for einsum as well.
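A minimal sketch of the idea, not the actual maxtext change: it assumes quant_dg is either a single AQT config or a dict of per-layer configs, and names such as `layer_name` and the `"default"` dict key are illustrative assumptions.

```python
# Illustrative sketch only: how an einsum helper could accept quant_dg either
# as a single AQT config or as a dict of per-layer configs when
# mixed-precision quantization is enabled.
import dataclasses
from typing import Any, Dict, Union

from aqt.jax.v2.flax import aqt_flax


@dataclasses.dataclass
class AqtQuantizationSketch:
  # A single AQT DotGeneral config, or a dict mapping layer names to configs
  # when mixed-precision is enabled.
  quant_dg: Union[Any, Dict[str, Any]]

  def einsum(self, layer_name: str = ""):
    """Returns an AqtEinsum op configured for the requested layer."""
    cfg = self.quant_dg
    if isinstance(cfg, dict):
      # Mixed precision: pick the layer-specific config, falling back to a
      # default entry when the layer has no dedicated one (the "default"
      # key is an assumption for this sketch).
      cfg = cfg.get(layer_name, cfg.get("default"))
    return aqt_flax.AqtEinsum(cfg=cfg)
```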

mailvijayasingh commented 2 weeks ago

LGTM!