-
I had a chance to reflect after PTC / CUDA-MODE and wanted to share some thoughts on future plans for sparsity in torchao.
## Current State
There are two components of sparsity: accuracy and…
-
## 🚀 Feature
Extension of the PyTorch pruning utilities to allow for the simplification of pruned models (i.e. actually removing zeroed-out tensors from the model).
## Motivation
Prun…
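A minimal sketch of what such simplification could look like for a single `nn.Linear` after row-wise structured pruning; the helper `shrink_pruned_linear` is hypothetical (not part of the existing PyTorch pruning API), and a complete solution would also need to adjust downstream layers whose input size changes:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

def shrink_pruned_linear(linear: nn.Linear) -> nn.Linear:
    """Return a smaller Linear containing only the non-zero output rows."""
    weight = linear.weight.detach()
    keep = weight.abs().sum(dim=1) != 0  # rows that survived pruning
    new = nn.Linear(linear.in_features, int(keep.sum()), bias=linear.bias is not None)
    new.weight.data.copy_(weight[keep])
    if linear.bias is not None:
        new.bias.data.copy_(linear.bias.detach()[keep])
    return new

layer = nn.Linear(16, 8)
prune.ln_structured(layer, name="weight", amount=0.5, n=2, dim=0)  # zero out half the rows
prune.remove(layer, "weight")  # make the pruning permanent
smaller = shrink_pruned_linear(layer)  # 8 -> 4 output features
```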
-
In the paper, you identified the unimportant SD blocks/layers.
In that case, you may not need to retrain the model
(because if you remove an unimportant block/layer, the performance is almost preserved).
Can…
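As a generic illustration (not the paper's code), erasing a block usually means replacing it with an identity mapping and evaluating the model as-is, which only works when the block's input and output shapes match:

```python
import torch.nn as nn

# Stand-in network; in practice this would be the SD model with its block list.
model = nn.Sequential(*[nn.Linear(64, 64) for _ in range(12)])

# "Erase" the block judged unimportant (index 7 is a hypothetical choice),
# then evaluate without retraining to see how much performance is preserved.
model[7] = nn.Identity()
# evaluate(model)  # compare metrics against the original model
```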
-
## 🚀 Feature
Support mainstream pruning techniques.
## Motivation
Recently, many new pruning algorithms have been proposed, but the [current implementation](https://github.com/pytorch/pytorch/blob/4…
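For reference, the existing implementation can already be extended by subclassing `prune.BasePruningMethod`; the sketch below uses a toy method (`EveryOtherPruningMethod`, made up for illustration) just to show where a new algorithm would plug in:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

class EveryOtherPruningMethod(prune.BasePruningMethod):
    """Toy example, not a mainstream algorithm: zero out every other weight."""
    PRUNING_TYPE = "unstructured"

    def compute_mask(self, t, default_mask):
        mask = default_mask.clone()
        mask.view(-1)[::2] = 0
        return mask

def every_other_unstructured(module, name):
    EveryOtherPruningMethod.apply(module, name)
    return module

layer = every_other_unstructured(nn.Linear(8, 4), name="weight")
```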
-
After merging #897, I checked to see what fails on `maxgamill-sheffield/topology`, as I'm working my way through #850 and this branch is due to be merged into `maxgamill-sheffield/800-better-tracing`.
…
-
Great work, team!
Currently, I am pruning the llama2-7b-chat-hf model from Hugging Face:
```
python main.py \
  --model NousResearch/Llama-2-7b-chat-hf \
  --prune_method wanda \
  …
```
-
Hi, I came across the [NMPruner](https://github.com/SeoLabCornell/torch2chip/blob/main/src/pruner/nm.py) class in your repository and am particularly interested in its **_structured fine-grained sparsity_**…
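For context (this is a generic sketch, not a claim about how `NMPruner` itself is implemented), N:M fine-grained structured sparsity is usually realized by keeping the N largest-magnitude weights within every group of M consecutive weights:

```python
import torch

def nm_mask(weight: torch.Tensor, n: int = 2, m: int = 4) -> torch.Tensor:
    """Generic N:M mask: in each group of m consecutive weights along the last
    dimension, keep only the n entries with the largest magnitude."""
    out_features, in_features = weight.shape
    assert in_features % m == 0
    groups = weight.abs().reshape(out_features, in_features // m, m)
    topk = groups.topk(n, dim=-1).indices  # positions of the n largest per group
    mask = torch.zeros_like(groups).scatter_(-1, topk, 1.0)
    return mask.reshape(out_features, in_features)

w = torch.randn(8, 16)
sparse_w = w * nm_mask(w, n=2, m=4)  # 2:4 fine-grained structured sparsity
```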
-
Hi, how can I implement global unstructured pruning using this library? It seems I can only prune individual layers, not the entire model.
Thanks
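For comparison (I'm not sure which library this refers to), PyTorch's built-in pruning utilities expose `prune.global_unstructured`, which pools parameters from several layers and applies a single global threshold:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Collect (module, parameter_name) pairs from the whole model, then prune the
# bottom 40% of weights by L1 magnitude, ranked globally across all layers.
parameters_to_prune = [
    (m, "weight") for m in model.modules() if isinstance(m, nn.Linear)
]
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.4,
)
```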
-
Dear MobileSAM Developers,
I hope this message finds you well. I am reaching out to discuss potential enhancements to the MobileSAM framework, particularly concerning its lightweight encoder's perf…
-
**System information**
- TensorFlow version (you are using): 2.5.0
- Are you willing to contribute it (Yes/No): Yes
**Motivation**
Deciding on where to have high filter/channel counts in con…