Steps to reproduce:

lms server start
lms load lmstudio-community/Phi-3.1-mini-128k-instruct-GGUF --gpu max -y
lms create
Answering the scaffold prompts as:
? Select a scaffold to use | Node.js with TypeScript
? Project Name test
...
Then run:
cd test
npm start
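npm start executes the entry point that lms create scaffolds. I don't have the generated file in front of me, so the following is only a rough sketch reconstructed from the output below; the SDK calls (listDownloadedModels, listLoaded, get, complete) and field names are assumptions and the real scaffolded code may differ:

import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();

// Print the models available on disk (produces the "Downloaded Models:" list below).
const downloaded = await client.system.listDownloadedModels();
console.log("Downloaded Models:");
for (const m of downloaded.slice(0, 5)) {
  console.log(`- ${m.path}`);
}
if (downloaded.length > 5) {
  console.log(`(... and ${downloaded.length - 5} more)`);
}

// Print the models currently loaded in the server (the "Loaded Models:" list below).
const loaded = await client.llm.listLoaded();
console.log("Loaded Models:");
for (const m of loaded) {
  console.log(`- ${m.path}`);
}

// Stream a raw completion from the first loaded model. The parameters match the ones
// visible in the predictionConfigStack of the error payload (temperature 0.7,
// maxPredictedTokens 100). The exact lookup call and streaming shape are assumptions.
const llm = await client.llm.get({ identifier: loaded[0].identifier });
process.stdout.write("The meaning of life is");
const prediction = llm.complete("The meaning of life is", {
  temperature: 0.7,
  maxPredictedTokens: 100,
});
for await (const text of prediction) {
  process.stdout.write(text);
}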
Output and exception:
(node:17040) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
Downloaded Models:
- bartowski/Phi-3.5-mini-instruct_Uncensored-GGUF/Phi-3.5-mini-instruct_Uncensored-Q4_K_S.gguf (phi3)
- nomic-ai/nomic-embed-text-v1.5-GGUF/nomic-embed-text-v1.5.Q4_K_M.gguf (nomic-bert)
- lmstudio-community/gemma-2-9b-it-GGUF/gemma-2-9b-it-Q3_K_L.gguf (gemma2)
- lmstudio-community/gemma-2-2b-it-GGUF/gemma-2-2b-it-Q4_K_M.gguf (gemma2)
- lmstudio-community/Qwen2.5-14B-Instruct-GGUF/Qwen2.5-14B-Instruct-Q4_K_M.gguf (qwen2)
(... and 2 more)
Loaded Models:
- lmstudio-community/Phi-3.1-mini-128k-instruct-GGUF/Phi-3.1-mini-128k-instruct-Q4_K_M.gguf
The meaning of life is[LMStudioClient][LLM][ClientPort] Received communication warning from the server: Received invalid creationParameter for channel, endpointName = predict, creationParameter = {"modelSpecifier":{"type":"instanceReference","instanceReference":"7Ji6gIhh4rmNY7hTrnNdFGFt"},"context":{"history":[{"role":"user","content":[{"type":"text","text":"The meaning of life is"}]}]},"predictionConfigStack":{"layers":[{"layerName":"apiOverride","config":{"fields":[{"key":"llm.prediction.temperature","value":0.7},{"key":"llm.prediction.maxPredictedTokens","value":{"checked":true,"value":100}},{"key":"llm.prediction.stopStrings","value":[]}]}},{"layerName":"completeModeFormatting","config":{"fields":[{"key":"llm.prediction.promptTemplate","value":{"type":"jinja","jinjaPromptTemplate":{"template":"{% for message in messages %}{{ message['content'] }}{% endfor %}","bosToken":"","eosToken":""},"stopStrings":[]}}]}}]}}. Zod error:
- creationParameter.history: Required
This is usually caused by communication protocol incompatibility. Please make sure you are using the up-to-date versions of the SDK and LM Studio.
Note: This warning was received from the server and is printed on the client for convenience.
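The mismatch is visible in the payload itself: the SDK sends the conversation nested under context.history, while this LM Studio build's schema for the predict channel expects a top-level history field (hence "creationParameter.history: Required"). Roughly, using only the field names from the log above (illustrative TypeScript types, not the actual internal schema):

// Shape the client (SDK) sends, taken from the logged creationParameter:
type ClientPredictParam = {
  modelSpecifier: { type: "instanceReference"; instanceReference: string };
  context: {
    history: Array<{ role: string; content: Array<{ type: "text"; text: string }> }>;
  };
  predictionConfigStack: { layers: Array<unknown> };
};

// Shape this server build appears to validate against, per the Zod error
// ("creationParameter.history: Required"): history at the top level, not under context.
type ServerExpectedPredictParam = {
  modelSpecifier: unknown;
  history: Array<unknown>;
  predictionConfigStack: unknown;
};

In other words, the installed @lmstudio/sdk and the running LM Studio server are speaking different revisions of the protocol, which matches the "up-to-date versions of the SDK and LM Studio" note in the warning.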