SystemSculpt / obsidian-systemsculpt-ai

Enhance your Obsidian App experience with AI-powered tools for note-taking, task management, and much, MUCH more.
MIT License
95 stars · 13 forks

LM Studio endpoint not working. #58

Open KnoBuddy opened 2 months ago

KnoBuddy commented 2 months ago

I can't get the endpoint to work properly with LM Studio. I have tried adding /v1 and /v1/chat/completions. Both http://localhost:1234 and http://localhost:1234/v1 return the same output. /v1/chat/completions returns nothing. The server works fine in smart connections and in several other apps and python scripts I've used.

Here is the server output:

[2024-09-02 06:45:09.000] [INFO] Received GET request to /v1/models with body: {}
[2024-09-02 06:45:09.000] [INFO] Returning { "data": [ { "id": "lmstudio-community/DeepSeek-Coder-V2-Lite-Instruct-GGUF/DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf", "object": "model", "owned_by": "organization-owner", "permission": [ {} ] } ], "object": "list" }
[2024-09-02 06:45:09.004] [ERROR] Unexpected endpoint or method. (GET /api/tags). Returning 200 anyway
[2024-09-02 06:45:10.653] [INFO] [LM STUDIO SERVER] Processing queued request...
[2024-09-02 06:45:10.653] [INFO] Received OPTIONS request to /v1/chat/completions with body: {}
[2024-09-02 06:45:10.653] [ERROR] 'messages' field is required
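For what it's worth, the log shows only an OPTIONS preflight reaching /v1/chat/completions with an empty body, which is why LM Studio complains that 'messages' is missing. A quick way to confirm the endpoint itself is healthy outside of Obsidian is a minimal POST that includes a messages field. This is a sketch, assuming the default port 1234; the helper name and prompt are mine, and the model id is taken from the log above:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the POST request LM Studio's OpenAI-compatible chat endpoint
    expects. The 'messages' field is mandatory -- its absence is exactly
    the "'messages' field is required" error in the server log."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "http://localhost:1234",
    "DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf",  # model id from the log
    "Say hi",
)
# Uncomment to actually send (requires a running LM Studio server):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If this request succeeds from a script but fails from Obsidian, the server side is fine and the problem is in how the request is made from the app (see the CORS discussion below in this thread).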

My settings in SystemSculpt:

[screenshot: SystemSculpt settings]

StraightOuttaCrompton commented 2 months ago

➕ Same issue

SystemSculpt commented 1 month ago

You should only be putting in the base URL; the client handles the /v1/... part automatically. Have you tried entering just the URL and port, e.g. http://localhost:1234 (or whatever port you have it set to)?

StraightOuttaCrompton commented 1 month ago

Yup, the error occurs even when putting in just the URL http://localhost:1234 without any path.

wsuff commented 1 month ago

I experienced this as well: LM Studio ended up showing an empty request, as mentioned above. After opening the dev tools and trying again, it turned out the issue isn't in the SystemSculpt plugin at all, but in the default CORS policy in LM Studio. If you turn on "Enable CORS" before starting the LM Studio API server, requests from the SystemSculpt plugin work as expected.

Even the tooltip in LM Studio 0.3.2 indicates that some integrations may require "Enable CORS" to be turned on: "Enabling CORS (Cross-Origin Resource Sharing) would allow websites you visit to make requests to LM Studio server. CORS might be required when making requests from a web page or VS Code / other extension."

Console log from the Obsidian developer console:

Access to fetch at 'http://localhost:1234/v1/chat/completions' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
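The failure above is the browser-side preflight check: Obsidian's renderer sends an OPTIONS request from the origin app://obsidian.md, and unless LM Studio answers with an Access-Control-Allow-Origin header covering that origin, the real POST is never sent, which is why LM Studio only ever logs the empty OPTIONS request. A simplified sketch of the decision the browser makes (the helper name is mine, not an API from either project):

```python
def preflight_allows(request_origin: str, response_headers: dict) -> bool:
    """Simplified version of the browser's CORS preflight decision:
    the preflight response must carry an Access-Control-Allow-Origin
    header matching the requesting origin (or the wildcard '*')."""
    allowed = response_headers.get("Access-Control-Allow-Origin")
    return allowed is not None and allowed in ("*", request_origin)

# With "Enable CORS" off, the response carries no CORS headers at all,
# so the preflight fails and the POST is blocked:
print(preflight_allows("app://obsidian.md", {}))  # False

# With "Enable CORS" on, the header is present and the fetch proceeds:
print(preflight_allows("app://obsidian.md",
                       {"Access-Control-Allow-Origin": "*"}))  # True
```

This matches the observed behavior: enabling CORS in LM Studio makes the header appear, the preflight passes, and the plugin's request goes through.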
JohnBeres commented 1 week ago

Same error here, yup. You need to enable CORS.