neuralmagic / sparseml

Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
Apache License 2.0
2.05k stars · 144 forks

Channelwise Quantization Tests #2283

Closed by Satrat 4 months ago
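For context on the PR title: channel-wise quantization assigns one scale per output channel instead of a single scale for the whole tensor, which typically preserves accuracy better for weights with uneven per-channel ranges. A minimal NumPy sketch of symmetric per-channel int8 quantization (an illustrative assumption, not SparseML's actual implementation):

```python
import numpy as np

def quantize_per_channel(w, axis=0, num_bits=8):
    """Symmetric per-channel quantization: one scale per channel along `axis`."""
    qmax = 2 ** (num_bits - 1) - 1  # 127 for int8
    # Reduce over every axis except the channel axis to get per-channel max |w|
    reduce_axes = tuple(i for i in range(w.ndim) if i != axis)
    max_abs = np.max(np.abs(w), axis=reduce_axes, keepdims=True)
    scale = max_abs / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

w = np.array([[0.5, -1.0], [2.0, 4.0]])
q, scale = quantize_per_channel(w, axis=0)
# Each row gets its own scale: 1.0/127 for row 0, 4.0/127 for row 1
```

Per-tensor quantization would instead use the single global scale 4.0/127, wasting most of the int8 range on the first row.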

Satrat commented 4 months ago
dbogunowicz commented 4 months ago

LGTM, but the runner is now arguably not running any transformers jobs:

```
Requested labels: k8s-eng-gpu-64G-v100-32G
Job defined at: neuralmagic/sparseml/.github/workflows/test-check.yaml@refs/pull/2283/merge
Waiting for a runner to pick up this job...
```
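The log shows the job requesting the self-hosted runner label `k8s-eng-gpu-64G-v100-32G` and stalling because no runner picks it up. A hypothetical sketch of how a job in a workflow like `test-check.yaml` pins such a label via `runs-on` (the job name and steps here are assumptions, not the actual workflow contents):

```yaml
# Hypothetical workflow fragment; only the runner label comes from the log above.
jobs:
  transformers-tests:
    runs-on: k8s-eng-gpu-64G-v100-32G
    steps:
      - uses: actions/checkout@v4
      - name: Run transformers tests
        run: pytest tests/sparseml/transformers  # assumed test path
```

If no registered runner advertises that exact label, the job waits indefinitely, which matches the reviewer's observation that the transformers jobs are effectively not running.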