baichuan-inc / Baichuan-7B

A large-scale 7B pretraining language model developed by BaiChuan-Inc.
https://huggingface.co/baichuan-inc/baichuan-7B
Apache License 2.0

For inference deployment, what is the minimum GPU memory required? And how much system RAM? [Question] #110

Open ArlanCooper opened 1 year ago

ArlanCooper commented 1 year ago

Required prerequisites

Questions

For deploying the model for inference, what is the minimum GPU memory required, and how much system RAM is needed? What is the minimum GPU memory after int8 quantization? And after int4 quantization?

Checklist
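
For reference, a rough back-of-the-envelope estimate of the weight memory at each precision is sketched below. This covers the weights only; a real deployment also needs headroom for activations, the KV cache, and framework overhead (typically a few extra GB), so treat the numbers as lower bounds.

```python
# Rough estimate of weight memory for a ~7B-parameter model at different precisions.
# Weights only; activations, KV cache, and framework overhead are not included.

PARAMS = 7e9  # Baichuan-7B has roughly 7 billion parameters

for name, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{name:>9}: ~{gib:.1f} GiB of weights")

# Approximate output:
# fp16/bf16: ~13.0 GiB of weights
#      int8: ~6.5 GiB of weights
#      int4: ~3.3 GiB of weights
```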

lovegit2021 commented 1 year ago

It takes about 10 GB of GPU memory. I also trained it on a V100 for one epoch, which took five days.
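
A minimal sketch of loading the model for inference with 8-bit weights via `transformers` and `bitsandbytes` is shown below. This is not an official Baichuan recipe; the quantization settings, memory figures, and the example prompt are assumptions, and the exact footprint depends on your library versions, sequence length, and batch size.

```python
# Minimal sketch: 8-bit inference for baichuan-7B via bitsandbytes.
# Assumes: pip install transformers accelerate bitsandbytes
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "baichuan-inc/baichuan-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # baichuan ships custom modeling code on the Hub
    device_map="auto",        # let accelerate place layers on GPU (spilling to CPU if needed)
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # ~7 GB of weights (estimate)
    # For int4, use BitsAndBytesConfig(load_in_4bit=True) instead (~4 GB of weights, estimate).
)

# Hypothetical prompt for a quick smoke test.
inputs = tokenizer("登鹳雀楼->王之涣\n夜雨寄北->", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```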