pfrankov / obsidian-local-gpt

Local Ollama and OpenAI-like GPT's assistance for maximum privacy and offline access
MIT License

Error while generating text: undefined #19

Closed stumads closed 1 month ago

stumads commented 2 months ago

This plugin has been incredible and I have been using it regularly. Since this morning, however, every action (both default and custom) has generated the error above. The drop-down menu appears as usual. I am using the OpenAI-compatible server with medium creativity; the server URL is https://api.openai.com and the default model is 4o, though I have changed models to see if there is a difference. I have renewed the API key to check whether that was the problem, and I have checked the API funding.

The error appears whether I select text or simply place the cursor after the text; the hotkey has also been checked.

Thanks again for an awesome plugin. Stu

nirfse commented 1 month ago

I have just installed this plugin and immediately ran into the same problem as soon as I set a custom hotkey for the context menu. If I call the context menu via the command palette (i.e. Ctrl+P → Local GPT: Show context menu), everything works as expected.

Edit: disregard my message above; the problem occurs intermittently in both cases.

pfrankov commented 1 month ago

That's not enough information to debug, or, as I'd say, "it works for me". Please try using an Ollama server (https://ollama.com/blog/openai-compatibility) and check whether the issue persists.

nirfse commented 1 month ago

Ok, that makes sense. I was using the Groq Cloud API (https://api.groq.com/openai/v1), which is OpenAI API-compatible, but with LocalGPT I got a couple of working requests for every five that failed with "json parsing error: undefined".
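For what it's worth, an "undefined" message is what you would expect when the server intermittently returns a non-JSON body (for example an HTML rate-limit page), leaving the client with no structured error to display. A minimal Python sketch of that failure mode; the plugin itself is TypeScript, and the `error.message` field here is just the OpenAI-style convention, so treat the details as assumptions:

```python
import json

def extract_error_message(body: str) -> str:
    """Return a human-readable error from an API response body,
    or "undefined" when none can be recovered."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        # Unparseable body (e.g. an HTML throttling page): nothing to show.
        return "undefined"
    # "error.message" is the OpenAI-style field; other APIs may differ.
    error = payload.get("error") or {}
    return error.get("message") or "undefined"

# An OpenAI-style error body yields a readable message:
print(extract_error_message('{"error": {"message": "Rate limit reached"}}'))
# An HTML 429 page yields no structured error at all:
print(extract_error_message("<html>429 Too Many Requests</html>"))
```

Under rate limiting, the same request can alternate between a proper JSON error and a raw throttling page, which would match the intermittent pattern described above.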

Switching to Ollama triggered another, unrelated issue with the plugin (it was trying to request the completions API with the plugin's ID as a TLD), but after reinstalling, it works just fine. I've tried ~20 attempts, including a couple of custom prompts, and I was no longer able to reproduce the issue.
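The wrong-TLD symptom described above is typical of naive URL joining: a base URL without a trailing slash loses its last path segment, and a malformed base can make a relative path resolve against the wrong host entirely. A small Python illustration with `urllib.parse.urljoin`; the Groq base URL is just an example, and the plugin's actual joining logic is not shown in this thread:

```python
from urllib.parse import urljoin

base = "https://api.groq.com/openai/v1"

# Without a trailing slash, the final path segment of the base is dropped:
print(urljoin(base, "chat/completions"))
# -> https://api.groq.com/openai/chat/completions

# With a trailing slash, the full base path is preserved:
print(urljoin(base + "/", "chat/completions"))
# -> https://api.groq.com/openai/v1/chat/completions
```

This is why reinstalling (which presumably reset the stored server URL) could make the symptom disappear.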

@stumads are you using Ollama or some custom API? If the latter, is this API endpoint supported by the plugin?

stumads commented 1 month ago

That's not enough information to debug, or, as I'd say, "it works for me". Please try using an Ollama server (https://ollama.com/blog/openai-compatibility) and check whether the issue persists.

Thank you for coming back to me on this. Without doing anything at all, it just started working again. Many thanks, Stu

stumads commented 1 month ago

@stumads are you using Ollama or some custom API? If the latter, is this API endpoint supported by the plugin?

@nirfse - I remained with the standard OpenAI API, and it just started to work again without any intervention.