Closed: BraksWord closed this issue 2 months ago
I'm having the same issue. I'm running a local model on localhost via LM Studio. Everything is properly configured and was working fine until the most recent updates; now there is no response when using the chat.
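For what it's worth, the local server itself seems fine. This is roughly how I sanity-check it outside Obsidian (a minimal sketch assuming LM Studio's default port 1234 and its OpenAI-compatible endpoint; the model name is just a placeholder):

```ts
// Rough sanity check against a local LM Studio server, run with Node 18+ (or Deno).
// Assumes the default address http://localhost:1234 and the OpenAI-compatible
// /v1/chat/completions endpoint; adjust the URL and model name for your setup.
const response = await fetch("http://localhost:1234/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "local-model", // placeholder; LM Studio serves whatever model is loaded
    messages: [{ role: "user", content: "Say hello" }],
  }),
});

console.log(response.status);       // 200 means the server is reachable
console.log(await response.json()); // should contain a chat completion
```

That returns a completion for me, so the server side looks OK; the problem seems to be in the chat itself.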
Interesting; I will look into this and test with new vaults/configurations to see what the problem is. Have either of you tried the latest 0.6.2 update to see if it's resolved?
I had the update, but chat wasn't working until I closed the chat and used the command palette to open a new one. It works now, thanks.
It isn't. I guess I'm doing something wrong. With the new version I opened a new chat and asked something, but it doesn't answer. Maybe it comes from my GPT API key not being correct? But from what I can see, that isn't the case either. Could you help me out?
Hi @BraksWord - I'm going to need more details to help you out. Which endpoint are you using, OpenAI / Grok / OpenRouter / Local? Which model? If you're using an API, do you have enough purchased credits with that API provider in order to use it? A screenshot or a few would really help out here. Has it worked in the past? Etc.
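If you want to rule the plugin out entirely, you can also hit your provider directly with the same key. Here's a rough sketch against the OpenAI chat completions endpoint (the key and model name are placeholders; substitute whatever you actually use):

```ts
// Minimal direct call to OpenAI's chat completions API to check that the key
// works and the account has credits. OPENAI_API_KEY is a placeholder; set it
// to your own key, ideally via an environment variable.
const apiKey = process.env.OPENAI_API_KEY ?? "";

const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: "gpt-4o-mini", // any chat model your account has access to
    messages: [{ role: "user", content: "ping" }],
  }),
});

// 200 = key and credits are fine; 401 usually means a bad key; 429 with an
// "insufficient_quota" error usually means no purchased credits.
console.log(res.status, await res.json());
```

If that request fails, the problem is on the provider's side rather than in the plugin.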
Hey @SystemSculpt, thanks for helping me. Here is as much information as I can give you. I use OpenAI, and I got the API key there (see picture below). (FYI, I'm not paying anything, and I think the problem might come from that?) It has never worked in the past, by the way, and here is what it looks like (another picture below).
But the fact that I can see a price, and that you asked me whether I had purchased credits (and I haven't), makes me think that I was the problem 😅 I thought I could use OpenAI inside Obsidian for free, like with ChatGPT... 😅
Thank you
Describe the question
Even though I'm using the correct API key (I think) and have the latest versions of Obsidian and the plugin, I have an issue: I never get an answer from GPT in the conversation. Thanks for your help.