[Issue closed by Trippnology 2 months ago]
I have the same kind of errors, it might be because we don't have the 0.2.22 version of LM Studio yet...
// Requires @lmstudio/sdk (this report used v0.0.12)
const { LMStudioClient } = require("@lmstudio/sdk");
const lmstudio = new LMStudioClient();

async function useModel(prompt) {
    /* Alternative: load by the full model file path:
    const llama3 = await lmstudio.llm.get({
        path: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf",
    }); */
    const llama3 = await lmstudio.llm.get({ path: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF" });
    const result = await llama3.complete(prompt);
    console.log(result.content);
    console.log(result.stats);
}

useModel("Tell me a joke").catch(console.error);
W [LMStudioClient][LLM][ClientPort] Produced communication warning: Received invalid result for rpc, endpointName = getModelInfo, result = {"identifier":"lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf","path":"lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf"}. Zod error:
- result.sessionIdentifier: Required
- result.descriptor: Required
This is usually caused by communication protocol incompatibility. Please make sure you are using the up-to-date versions of the SDK and LM Studio.
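For context, the Zod error above means the SDK validated the RPC payload against a schema that requires `sessionIdentifier` and `descriptor` fields, which LM Studio 0.2.21 does not send. A minimal stand-in for that kind of required-field check (field names taken from the warning; the SDK's actual schema is Zod-based and more involved):

```javascript
// Sketch of a required-field check like the one the SDK performs.
// Returns the names of any missing required fields.
function validateModelInfo(result) {
  const required = ["identifier", "path", "sessionIdentifier", "descriptor"];
  return required.filter((field) => !(field in result));
}

// Shape of the payload LM Studio 0.2.21 actually returned (from the warning above):
const payload = {
  identifier: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf",
  path: "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf",
};

console.log(validateModelInfo(payload)); // [ 'sessionIdentifier', 'descriptor' ]
```

Since 0.2.22 ships a server that includes those fields, updating both sides resolves the mismatch.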
It has been confirmed on Discord; we have to wait for the 0.2.22 release: "That is the cause. Stay tuned for an announcement soon."
OK, no worries. Thank you for reporting back here too.
@Trippnology @raphiki please try with 0.2.22 from the homepage: https://lmstudio.ai. If it still doesn't work, please let us know
Pleased to report it's working great on v0.2.22!
Thank you for all your hard work on this fantastic project!
LM Studio version: 0.2.21
SDK version: 0.0.12
I have the LMS GUI open, the local server running, and a model loaded.
Running this code causes the following error:
If I try to load a model by name, I get a different error: