Closed: caijx168 closed this issue 3 months ago
I used the inference script provided by LLaMA-Factory, launched with the following command:

CUDA_VISIBLE_DEVICES=0 API_PORT=8001 nohup python src/api_demo.py \
    --model_name_or_path /home/qwen/Qwen2-7B \
    --template qwen \
    --infer_backend vllm \
    --vllm_maxlen 32000 \
    --vllm_enforce_eager &
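For reference, the api_demo script above serves an OpenAI-compatible API on the configured API_PORT (8001 here). A minimal sketch of how the identity question could be sent to that endpoint (the base URL and model name below are assumptions matching the command above, not taken from the thread):

```python
import json

# Hypothetical endpoint matching the launch command above (API_PORT=8001).
BASE_URL = "http://localhost:8001/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for the local server."""
    return {
        # Single-model servers typically ignore the model name, but the
        # field is required by the OpenAI-compatible schema.
        "model": "Qwen2-7B",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

payload = build_request("What is your name, and who developed you?")
print(json.dumps(payload, ensure_ascii=False, indent=2))
# Once the server is running, send it with e.g.:
#   requests.post(BASE_URL, json=payload).json()
```

With a base (non-instruct) checkpoint, the answer to such a question reflects whatever identities appear in the pretraining data, which is the behavior reported in this issue.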
That indicates the dataset was distilled from OpenAI models. Also, you are using the base model, not the instruct model; the instruct model is the one aligned with Alibaba's values. Once you switch to the chat model, remember to ask it "where is Jack Ma?"
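Following the comment above, the fix would be to point the same launch command at the instruct checkpoint instead of the base one. A sketch under the assumption that the aligned model lives at a sibling path such as /home/qwen/Qwen2-7B-Instruct (path is hypothetical; only the model argument changes):

CUDA_VISIBLE_DEVICES=0 API_PORT=8001 nohup python src/api_demo.py \
    --model_name_or_path /home/qwen/Qwen2-7B-Instruct \
    --template qwen \
    --infer_backend vllm \
    --vllm_maxlen 32000 \
    --vllm_enforce_eager &

The instruct/chat model has been fine-tuned to answer identity questions as Qwen, whereas the base model simply continues text the way its pretraining data would.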
Isn't this perfectly normal? Data is the most expensive part of AI; if everything had to be built from scratch, it would take forever.
This issue has been automatically marked as inactive due to lack of recent activity. Should you believe it remains unresolved and warrants attention, kindly leave a comment on this thread.
I downloaded Qwen2-7B from ModelScope (git clone https://www.modelscope.cn/qwen/Qwen2-7B.git). Why, when I ask the model its name and developer during inference, does it answer ChatGPT and OpenAI?