logancyang / obsidian-copilot

A ChatGPT Copilot in Obsidian
https://www.obsidiancopilot.com/
GNU Affero General Public License v3.0

TypeError after sending messages in the chat #167

Closed — sopyb closed this issue 6 months ago

sopyb commented 9 months ago

Version: 2.4.3

Setting model to LocalAI: llama-2-uncensored-q4ks
[Error: No chat model set] { isTrusted: [Getter] }
*** DEBUG INFO ***
user message: test
model: llama-2-uncensored-q4ks
temperature: 0.7
maxTokens: 1500
system message: You are Obsidian Copilot, a helpful assistant that integrates AI to Obsidian note-taking.
chat context turns: 3

Model request failed: [TypeError: Cannot read properties of undefined (reading 'llm')]

Let me know if I need to provide any additional information.

logancyang commented 8 months ago

Looks like this one and https://github.com/logancyang/obsidian-copilot/issues/176 are related. Somehow the chain was undefined and not initialized properly. I haven't personally reproduced this. Can you describe the steps you took?
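A minimal sketch of the failure mode described above: if the chat chain is never initialized, any access like `chain.llm` throws the reported `TypeError: Cannot read properties of undefined (reading 'llm')`. The names here (`ChatChain`, `setChain`, `sendMessage`) are illustrative only, not the plugin's actual API; the point is that a guard converts the opaque TypeError into an actionable error.

```typescript
// Hypothetical illustration — these names are NOT from obsidian-copilot's codebase.
interface ChatChain {
  llm: { modelName: string };
}

let chain: ChatChain | undefined;

function setChain(c: ChatChain): void {
  chain = c;
}

function sendMessage(message: string): string {
  // Without this guard, `chain.llm` on an undefined chain throws:
  // "TypeError: Cannot read properties of undefined (reading 'llm')"
  if (!chain) {
    throw new Error("Chat chain not initialized — set a chat model first.");
  }
  return `[${chain.llm.modelName}] ${message}`;
}
```

With a guard like this, a failed model setup surfaces as "Chat chain not initialized" instead of an undefined-property crash deep inside the request path.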

logancyang commented 6 months ago

The LocalAI option has been replaced with LM Studio and Ollama. Please try those instead!