lmstudio-ai / lmstudio.js

LM Studio TypeScript SDK (pre-release public alpha)
https://lmstudio.ai/docs/lmstudio-sdk/quick-start
Apache License 2.0

Zod errors on getModelInfo and loadModel endpoints #9

Closed: Trippnology closed this 2 months ago

Trippnology commented 2 months ago

LM Studio version: 0.2.21
SDK version: 0.0.12

I have the LMS GUI open, the local server running, and a model loaded.

const model = await client.llm.get({});

Causes the following error:

W [LMStudioClient][LLM][ClientPort] Produced communication warning: Received invalid result for rpc, endpointName = getModelInfo, result = {"identifier":"bartowski/Starling-LM-7B-beta-GGUF/Starling-LM-7B-beta-Q8_0.gguf","path":"bartowski/Starling-LM-7B-beta-GGUF/Starling-LM-7B-beta-Q8_0.gguf"}. Zod error:

- result.sessionIdentifier: Required
- result.descriptor: Required

This is usually caused by communication protocol incompatibility. Please make sure you are using the up-to-date versions of the SDK and LM Studio.
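
(For context, `client` above is an LMStudioClient, the class named in the warning. A minimal sketch of the setup, assuming the package is published as `@lmstudio/sdk`:)

import { LMStudioClient } from "@lmstudio/sdk";

// Connect to the local LM Studio server (default address)
const client = new LMStudioClient();

// Empty selector: get whichever model is currently loaded
const model = await client.llm.get({});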

If I try to load a model by name, I get a different error:

const model = await client.llm.load('bartowski/Starling-LM-7B-beta-GGUF');

W [LMStudioClient][LLM][ClientPort] Received communication warning from the server: Received invalid creationParameter for channel, endpointName = loadModel, creationParameter = {"path":"bartowski/Starling-LM-7B-beta-GGUF","config":{},"noHup":false}. Zod error:

- creationParameter.acceleration: Required

This is usually caused by communication protocol incompatibility. Please make sure you are using the up-to-date versions of the SDK and LM Studio.

Note: This warning was received from the server and is printed on the client for convenience.
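
(Both warnings come from the SDK validating messages against Zod schemas, and failing because one side of the connection is missing fields the other side marks as required. An illustrative sketch of the getModelInfo case; the real schema lives in the SDK, and the field types here are assumptions:)

import { z } from "zod";

// Hypothetical stand-in for the SDK's getModelInfo result schema
const resultSchema = z.object({
    identifier: z.string(),
    path: z.string(),
    sessionIdentifier: z.string(),           // SDK 0.0.12 requires this
    descriptor: z.object({}).passthrough(),  // SDK 0.0.12 requires this
});

// The payload LM Studio 0.2.21 actually returns:
const parsed = resultSchema.safeParse({
    identifier: "bartowski/Starling-LM-7B-beta-GGUF/Starling-LM-7B-beta-Q8_0.gguf",
    path: "bartowski/Starling-LM-7B-beta-GGUF/Starling-LM-7B-beta-Q8_0.gguf",
});

// parsed.success is false; parsed.error reports the two missing
// required fields, which the client surfaces as the warning above.
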
raphiki commented 2 months ago

I'm seeing the same kind of errors; it might be because we don't have version 0.2.22 of LM Studio yet...

import { LMStudioClient } from "@lmstudio/sdk";

// Client pointed at the local LM Studio server
const lmstudio = new LMStudioClient();

async function useModel(prompt) {
    /* Alternative: select by the full model file path:
    const llama3 = await lmstudio.llm.get({
        path: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf",
    }); */
    const llama3 = await lmstudio.llm.get({ path: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF" });

    const result = await llama3.complete(prompt);
    console.log(result.content);
    console.log(result.stats);
}

useModel("Tell me a joke");

W [LMStudioClient][LLM][ClientPort] Produced communication warning: Received invalid result for rpc, endpointName = getModelInfo, result = {"identifier":"lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf","path":"lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf"}. Zod error:

- result.sessionIdentifier: Required
- result.descriptor: Required

This is usually caused by communication protocol incompatibility. Please make sure you are using the up-to-date versions of the SDK and LM Studio.
raphiki commented 2 months ago

It has been confirmed on Discord that we have to wait for version 0.2.22: "That is the cause. Stay tuned for an announcement soon."

Trippnology commented 2 months ago

OK, no worries. Thank you for reporting back here too.

yagil commented 2 months ago

@Trippnology @raphiki please try with 0.2.22 from the homepage: https://lmstudio.ai. If it still doesn't work, please let us know
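
(A quick way to verify after upgrading is to re-run the two calls from the original report; a sketch, using the same client setup as above:)

// Previously failed with the getModelInfo Zod warning
const model = await client.llm.get({});
console.log(model);

// Previously failed with the loadModel Zod warning
await client.llm.load('bartowski/Starling-LM-7B-beta-GGUF');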

Trippnology commented 2 months ago

Pleased to report it's working great on v0.2.22!

Thank you for all your hard work on this fantastic project!