OpenThaiGPT / openthaigpt-pretraining
Apache License 2.0
feat(model): Add quantization load #283 (Open)

boss-chanon opened and commented 1 year ago
Why this PR
Add quantization load for QLoRA
Changes
Add quantization load 8 bit
Add quantization load 4 bit
Load model into memory instead of onto GPUs
Related Issues
Close #
Checklist
[ ] PR title should follow the naming convention
[ ] Assign yourself to Assignees
[ ] Tag related issues
[ ] Constant names should be ALL_CAPITAL, function names should be snake_case, and class names should be CamelCase
[ ] Complex functions/algorithms should have a docstring
[ ] A PR should not change more than 200 lines (exception for test files); if it does, please open multiple PRs
[ ] At least one PR reviewer must come from the task's team (model, eval, data)