[10/25] Joonhyung Lee, LLM.int8(): 8-bit Matrix Multiplication for Transformers at Scale #9

veritas9872 closed this issue 1 year ago

veritas9872 commented 1 year ago

Date: 2022.10.25
Presenter: Joonhyung Lee

ArXiv: https://arxiv.org/abs/2208.07339
Blog: https://timdettmers.com/2022/08/17/llm-int8-and-emergent-features
Blog: https://huggingface.co/blog/hf-bitsandbytes-integration
GitHub: https://github.com/TimDettmers/bitsandbytes
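
For anyone following the links above, here is a minimal sketch of using the bitsandbytes 8-bit integration described in the Hugging Face blog to run LLM.int8() inference. The checkpoint name is only an illustrative choice, and the exact keyword arguments may differ across `transformers` versions; this is a usage sketch, not part of the presentation itself.

```python
# Minimal sketch: load a causal LM in 8-bit via the bitsandbytes integration
# described in the linked Hugging Face blog post.
# Assumes `transformers`, `accelerate`, and `bitsandbytes` are installed
# and a CUDA GPU is available. The model name is an example, not prescribed
# by the paper or this issue.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-1.3b"  # example checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)

# load_in_8bit=True replaces nn.Linear layers with bitsandbytes 8-bit linear
# layers, which perform the LLM.int8() vector-wise int8 matmul while keeping
# outlier feature dimensions in fp16.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    load_in_8bit=True,
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```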