ossirytk / llama-cpp-langchain-chat

The Unlicense

I get the error: UnboundLocalError: cannot access local variable 'card' where it is not associated with a value #1

Closed: diegonaranjo closed this issue 1 year ago

diegonaranjo commented 1 year ago

Hi, excellent work! I am very interested in trying this app. I'm trying to use it as-is with the default settings, but I'm getting this error:

llama-cpp-langchain-chat\src\llama_cpp_langchain_chat\__init__.py", line 116, in parse_prompt
    char_name = card["name"] if "name" in card else card["char_name"]
                                          ^^^^
UnboundLocalError: cannot access local variable 'card' where it is not associated with a value

I understand this is because I have not assigned a Character. Is there any possibility of including a standard Character by default, so it can be tested out of the box? Or maybe you can tell me where the error may be. Thank you so much!
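(For context, an UnboundLocalError like this usually means the variable was only assigned inside a branch that never ran. A minimal, hedged sketch that reproduces the same failure when no character file is configured; the real parse_prompt signature and logic in the repo may differ:)

```python
# Hedged sketch, not the project's actual code: 'card' is only bound inside the
# branch that runs when a character file is configured, so with no card file
# the later access raises UnboundLocalError.
def parse_prompt(card_path=None):
    if card_path is not None:
        card = {"name": "Example"}  # 'card' is only assigned here
    # With card_path=None the branch above is skipped, so this line raises
    # "UnboundLocalError: cannot access local variable 'card' ..."
    char_name = card["name"] if "name" in card else card["char_name"]
    return char_name
```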

ossirytk commented 1 year ago

You need a character file or a prompt template. You can use https://zoltanai.github.io/character-editor/ to create a character card, or you can search the internet for TavernAI, Pygmalion, etc. character cards. Then add it to the .env file.
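As a hedged illustration, a minimal card could be written like the sketch below. The "name" / "char_name" keys come from the traceback above; every other field name is an assumption, and a card exported from the character editor or TavernAI will contain more fields.

```python
# Hedged example: write a minimal character card with the keys that the
# traceback shows parse_prompt reading ("name", with "char_name" as a fallback).
import json

card = {
    "name": "Assistant",
    "description": "A helpful, respectful and honest assistant.",  # assumed field name
}

with open("assistant.json", "w", encoding="utf-8") as f:
    json.dump(card, f, indent=2)

# Then point the character card entry in the .env file at assistant.json
# (the exact .env variable name depends on the project's .env template).
```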

ossirytk commented 1 year ago

Or you can have a prompt txt file. It needs to be in a format like "### Instruction: Continue the chat dialogue below. Write {character}'s next reply in a chat between User and {character}. Write a single reply only." or "### Instruction: You are a helpful, respectful and honest assistant. Always answer as helpfully as possible. If you don't know the answer to a question, please don't share false information."
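Laid out as a standalone prompt txt file, the first template would look roughly like this (line-break placement is an assumption; the {character} placeholder is presumably filled in by the app):

```
### Instruction:
Continue the chat dialogue below. Write {character}'s next reply in a chat between User and {character}. Write a single reply only.
```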

diegonaranjo commented 1 year ago

Hi! I was able to solve the problem and get it working, thank you very much for your help!!