neuralmagic / sparseml

Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models.
Apache License 2.0

[GPTQ Modifier UX] Update tests to use GPTQModifier for obcq style quantization #2294

Closed. rahul-tuli closed this 1 month ago.

rahul-tuli commented 1 month ago

This PR updates the test recipes and the README to use the new GPTQModifier for OBCQ-style quantization; a sketch of what such a recipe can look like is shown below.
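
For context, here is a minimal sketch of a one-shot quantization recipe built around GPTQModifier, in the YAML recipe style SparseML uses. The stage name, group names, and the specific field values (block_size, dampening_frac, the weight scheme, etc.) are illustrative assumptions for this sketch, not a verbatim copy of the recipes changed in this PR; consult the updated test recipes and README for the exact schema.

```yaml
# Hypothetical one-shot recipe sketch using GPTQModifier.
# Stage/group names and parameter values below are assumptions for illustration.
test_stage:
  quant_modifiers:
    GPTQModifier:
      # Run layers sequentially vs. all at once (illustrative default)
      sequential_update: false
      # Hessian dampening fraction used by the GPTQ solver (illustrative value)
      dampening_frac: 0.001
      # Column block size processed per GPTQ update step (illustrative value)
      block_size: 128
      # Quantization configuration: which modules to target and how to quantize them
      config_groups:
        group_0:
          targets: ["Linear"]        # apply to Linear layers (illustrative target)
          weights:
            num_bits: 8              # 8-bit integer weights (illustrative scheme)
            type: "int"
            symmetric: true
      # Modules to leave unquantized (illustrative)
      ignore: ["lm_head"]
```

In the old OBCQ-style recipes, quantization settings were expressed differently; the point of this change is that tests and documentation now route quantization through GPTQModifier instead.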