mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Template failed loading: template: prompt:1:8: executing "prompt" at <.RoleName>: can't evaluate field RoleName in type model.PromptTemplateData #1151

Closed · yourtiger closed this issue 10 months ago

yourtiger commented 10 months ago

LocalAI version:

quay.io/go-skynet/local-ai:latest

Environment, CPU architecture, OS, and Version:

Linux localhost.localdomain 3.10.0-1160.99.1.el7.x86_64 #1 SMP Wed Sep 13 14:19:20 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

Describe the bug

I use Llama2-chat-13B and ask questions through Postman. LocalAI returns the error in the title (screenshot not preserved).

To Reproduce

Postman request (screenshot)

models folder (screenshot)

llama2.yaml (screenshot)

llama2-chat.tmpl (screenshot)

llama2-completion.tmpl (screenshot)
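
Since the Postman screenshot is not preserved, here is a hypothetical equivalent of the request (the model name llama2 is assumed from the yaml filename; the endpoint is LocalAI's OpenAI-compatible chat completions route):

```sh
# hypothetical reproduction of the Postman request
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama2",
    "messages": [{"role": "user", "content": "Hello, who are you?"}]
  }'
```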

Expected behavior

Logs

Additional context

localai-bot commented 10 months ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

but... I can also be funny or helpful :smile_cat: and I can generally provide good tips, or point to places in the documentation or the code worth looking at, based on what you wrote in the issue.

Don't engage in conversation with me, I don't support (yet) replying!

:warning::warning::warning::warning::warning:

I'm sorry to hear that you are having issues with LocalAI. From the information provided, it seems like there might be an issue with the models or the configuration files. To troubleshoot this further, I would recommend checking the following:

  1. Ensure that the models directory is in the correct location and has the correct permissions.
  2. Check the llama2.yaml and llama2-chat.tmpl files to ensure they are correctly formatted and contain the correct information.
  3. Run LocalAI with the --debug flag to generate more detailed logs (see the sketch after this list).
  4. Check the OpenAI API key and host settings in the frontend configuration file to ensure they are correct.

If you are still having issues after checking these steps, please provide more details or log files so we can better assist you.
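
For step 3, a minimal sketch of enabling verbose logs with the Docker image (the DEBUG environment variable is documented by LocalAI as equivalent to --debug in this era; exact flags may differ between versions, so treat this as a sketch rather than a verified command):

```sh
# run the same image with debug logging enabled
# (adjust ports and volume paths to your setup)
docker run -p 8080:8080 \
  -v $PWD/models:/models \
  -e DEBUG=true \
  quay.io/go-skynet/local-ai:latest
```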


yourtiger commented 10 months ago

Now I know the reason (see https://localai.io/advanced/): if system_prompt is used in the template, then system_prompt must also be defined in the model's yaml file.
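
In other words, the template file and the model yaml have to agree. A minimal sketch of that pairing (the .SystemPrompt field and system_prompt key are as described at https://localai.io/advanced/; file names and values here are placeholders, not the reporter's actual files):

```
{{/* llama2-chat.tmpl: references the system prompt */}}
[INST] <<SYS>>
{{.SystemPrompt}}
<</SYS>>

{{.Input}} [/INST]
```

```yaml
# llama2.yaml: because the template uses .SystemPrompt,
# system_prompt must be defined here
name: llama2
system_prompt: "You are a helpful assistant."
template:
  chat: llama2-chat
  completion: llama2-completion
```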

zenyanbo commented 7 months ago

> https://localai.io/advanced/ Now I know the reason: if system_prompt is used in the template, then system_prompt must be defined in the model's yaml file.

Hello, I also encountered this problem. It shows `executing "prompt" at <.RoleName>: can't evaluate field RoleName in type model.PromptTemplateData`. My template is:

<|im_start|>{{if eq .RoleName "assistant"}}assistant{{else if eq .RoleName "system"}}system{{else if eq .RoleName "user"}}user{{end}}
{{if .Content}}{{.Content}}{{end}}
<|im_end|>

How did you get it to work? Can you provide your configuration file for reference? Thanks.
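
A hedged pointer on this error: it names model.PromptTemplateData, the data passed to the whole-prompt chat/completion templates, while https://localai.io/advanced/ documents .RoleName as a field of the per-message chat_message template. A sketch of wiring the ChatML-style template above in as chat_message (key names per those docs; availability may depend on the LocalAI version, and the file names are hypothetical):

```yaml
# model yaml (hypothetical): .RoleName/.Content are evaluated
# per message only in the chat_message template
name: chatml-model
template:
  chat_message: chatml-message   # the <|im_start|> template shown above
  chat: chatml-chat              # whole-prompt template (.Input etc.)
```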

yourtiger commented 7 months ago

I'm not sure whether LocalAI's templates support RoleName, or the format you wrote. The template parameters I relied on are the ones documented at https://localai.io/advanced/. The problem I ran into at the time was that my template used the system_prompt variable, but system_prompt was not defined in the llama2.yaml file used at startup, so I only needed to add a system_prompt entry to llama2.yaml, as follows (screenshot not preserved):
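
Since that screenshot is not preserved, a hypothetical reconstruction of the kind of addition described (the system_prompt key is per the docs; the prompt text is invented):

```yaml
# llama2.yaml (hypothetical excerpt)
system_prompt: "You are a helpful assistant."
```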

zenyanbo commented 7 months ago

Oh, I see RoleName in your config llama2.yaml. So the error disappears after you add system_prompt; does that mean LocalAI supports RoleName? I also saw a similar template format in the examples, but it reports the same error for me.