ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 LLMs (phase-2 project) with 64K long-context models
Apache License 2.0

Python error when deploying and running inference_with_transformers_zh #390

Closed — Aridea2021 closed this 9 months ago

Aridea2021 commented 10 months ago

The following items must be checked before submission

Issue type

Model quantization and deployment

Base model

Chinese-Alpaca-2-16K (7B/13B)

Operating system

Linux

Detailed problem description

# python scripts/inference/inference_hf.py     --base_model ../hfl-chinese-alpaca-2-1.3b     --with_prompt     --interactive

Dependencies (required for code-related issues)

peft                      0.6.0
sentencepiece             0.1.99
torch                     2.1.0
transformers              4.35.0
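Before debugging further, it can help to confirm which versions are actually installed in the active environment and compare them against the repo's requirements file. This is a minimal sketch (not from the repo's scripts) using the standard library's `importlib.metadata`; the package list is taken from the dependencies above, plus `psutil` since it appears in the traceback:

```python
# Print installed versions of the relevant dependencies, so they can be
# compared against the repository's requirements.txt.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("peft", "sentencepiece", "torch", "transformers", "psutil"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```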

Run log or screenshots

#  USE_XFORMERS_ATTENTION:  True
STORE_KV_BEFORE_ROPE: False
Apply NTK scaling with ALPHA=1.0
The value of scaling factor will be read from model config file, or set to 1.
Traceback (most recent call last):
  File "/mnt/d/llama-cpp/Chinese-LLaMA-Alpaca-2-3.2/scripts/inference/inference_hf.py", line 141, in <module>
    base_model = LlamaForCausalLM.from_pretrained(
  File "/home/aridea/software/Anaconda3/envs/chat/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3396, in from_pretrained
    max_memory = get_balanced_memory(
  File "/home/aridea/software/Anaconda3/envs/chat/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 771, in get_balanced_memory
    max_memory = get_max_memory(max_memory)
  File "/home/aridea/software/Anaconda3/envs/chat/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 653, in get_max_memory
    max_memory["cpu"] = psutil.virtual_memory().available
  File "/home/aridea/software/Anaconda3/envs/chat/lib/python3.10/site-packages/psutil/__init__.py", line 1976, in virtual_memory
    ret = _psplatform.virtual_memory()
  File "/home/aridea/software/Anaconda3/envs/chat/lib/python3.10/site-packages/psutil/_pslinux.py", line 419, in virtual_memory
    mems[fields[0]] = int(float(fields[1])) * 1024
ValueError: could not convert string to float: b'kB'
iMountTai commented 10 months ago

Please first test with the versions pinned in the requirements file. We will later adapt the project to the newer versions of the current dependencies (e.g. Transformers, gradio) in one pass.


Aridea2021 commented 10 months ago

It seems that psutil versions 5.0 and above do not support the WSL1 Linux kernel; `pip install psutil==4.3.1` got it working. Thanks a lot!
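One plausible reading of the traceback: psutil 5.x parses `/proc/meminfo` line by line and assumes the second whitespace-separated field of every line is a number, but under WSL1 some entries can apparently lack the numeric field, so the parser sees `b'kB'` where it expects a value. A minimal sketch of that parsing logic (simplified for illustration; not the actual psutil source, and the malformed WSL1 line shown is a hypothetical example):

```python
# Simplified sketch of how psutil's _pslinux.virtual_memory() reads
# /proc/meminfo: each line is split into fields, and fields[1] is assumed
# to be a number of kilobytes.
def parse_meminfo(lines):
    mems = {}
    for line in lines:
        fields = line.split()
        # e.g. b"MemTotal: 16384 kB" -> [b"MemTotal:", b"16384", b"kB"]
        mems[fields[0]] = int(float(fields[1])) * 1024
    return mems

# A well-formed line parses fine:
print(parse_meminfo([b"MemTotal: 16384 kB"]))

# A line missing the numeric field makes fields[1] == b"kB", which
# reproduces the reported error:
try:
    parse_meminfo([b"HugePages_Total: kB"])  # hypothetical malformed line
except ValueError as e:
    print(e)  # could not convert string to float: b'kB'
```

Pinning `psutil==4.3.1` sidesteps the issue because that older release gathers memory statistics differently and does not trip over these lines.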


github-actions[bot] commented 9 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 9 months ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.