deepseek-ai / DeepSeek-V2

DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model

Startup on 8 * A100 is extremely slow. Has anyone managed to launch it successfully? #11

Closed: CarryChang closed this issue 4 months ago

zwd003 commented 4 months ago

We suggest launching with vLLM: https://github.com/vllm-project/vllm/pull/4650
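
For reference, here is a minimal sketch of what an offline-inference launch with vLLM could look like once a build containing that PR is installed. The model identifier, `tensor_parallel_size=8`, `max_model_len`, and the sampling settings are assumptions for an 8 * A100 node, not values stated in this thread; adjust them to your setup.

```python
# Hedged sketch: requires a vLLM version that includes DeepSeek-V2 support (PR #4650).
from vllm import LLM, SamplingParams

# Assumed settings: 8-way tensor parallelism across the A100s and a reduced
# context length to keep the KV cache within GPU memory.
llm = LLM(
    model="deepseek-ai/DeepSeek-V2",
    trust_remote_code=True,
    tensor_parallel_size=8,
    max_model_len=8192,
)

sampling_params = SamplingParams(temperature=0.3, max_tokens=256)
outputs = llm.generate(
    ["Explain mixture-of-experts models in two sentences."],
    sampling_params,
)
print(outputs[0].outputs[0].text)
```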

stack-heap-overflow commented 4 months ago

The accelerate library, as used in the HuggingFace loading path, miscalculates the GPU memory allocation for this model. The example code has now been updated, which should substantially shorten the model loading time.

Change the model-loading code to:

model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True, device_map="sequential", torch_dtype=torch.bfloat16, max_memory=max_memory, attn_implementation="eager")
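
For completeness, a self-contained version of that snippet might look like the sketch below. The 75GB-per-GPU budget and the assumption of 8 devices are illustrative choices based on the 8 * A100 setup in this issue, not values given in the comment above.

```python
# Hedged sketch of a full loading script for an 8-GPU node; size max_memory to your cards.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "deepseek-ai/DeepSeek-V2"

# Cap per-device memory so layers are placed sequentially across GPUs
# instead of relying on accelerate's automatic memory estimate.
max_memory = {i: "75GB" for i in range(8)}  # assumed budget per A100

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    device_map="sequential",      # fill GPU 0 first, then GPU 1, and so on
    torch_dtype=torch.bfloat16,
    max_memory=max_memory,
    attn_implementation="eager",
)
```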