Open franky1024 opened 1 year ago
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openbmb/cpm-bee-5b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("openbmb/cpm-bee-5b", trust_remote_code=True).cuda()
result = model.generate({"input": "今天天气不错,", "<ans>": ""}, tokenizer)
print(result)
```

Environment: Python 3.9, transformers
What is your memory configuration? Loading the 5B model should require at least 10 GB of RAM.
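The 10 GB figure follows from the parameter count alone. A rough back-of-envelope sketch (assuming ~5 billion parameters and counting only the weights, not activations, KV cache, or framework overhead):

```python
# Rough memory estimate for loading a 5B-parameter model (weights only).
# Assumes exactly 5e9 parameters; the real footprint is higher due to
# activations, KV cache, and framework overhead.
PARAMS = 5_000_000_000

def weights_gb(bytes_per_param: int) -> float:
    """Gigabytes needed just to hold the weights at a given precision."""
    return PARAMS * bytes_per_param / 1e9

print(f"fp32: {weights_gb(4):.0f} GB")  # 4 bytes per parameter -> 20 GB
print(f"fp16: {weights_gb(2):.0f} GB")  # 2 bytes per parameter -> 10 GB
```

So 10 GB is the floor for half-precision weights; loading in full fp32 doubles that.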