princeton-nlp / LLM-Shearing

[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
https://arxiv.org/abs/2310.06694
MIT License

Create cleanshm.sh #30

Closed by Longyichen 9 months ago

Longyichen commented 9 months ago

Fixes a problem that prevents normal training by clearing the shared-memory cache.
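For illustration, here is a minimal sketch of what a `cleanshm.sh` script of this kind might look like, assuming the cache being cleared consists of stale `/dev/shm` entries (and a temporary streaming cache directory) left behind by interrupted dataloader runs. The paths and cleanup strategy below are assumptions for illustration and may differ from the script actually added in this PR.

```bash
#!/usr/bin/env bash
# Hypothetical sketch: remove stale shared-memory segments so a fresh
# training job can start without colliding with leftovers from a
# previous, interrupted run.

set -euo pipefail

# Remove shared-memory files owned by the current user under /dev/shm.
# (Assumption: the dataloader stores its cache here.)
find /dev/shm -maxdepth 1 -user "$(whoami)" -type f -exec rm -f {} +

# Also clear a leftover temporary cache directory, if present.
# (Directory name is an assumption for illustration.)
rm -rf /tmp/streaming 2>/dev/null || true

echo "Shared-memory cache cleared."
```

Running such a script before relaunching training would ensure that stale cache entries from a crashed or interrupted job do not interfere with the new run.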