lmstudio-ai / lms

LM Studio CLI. Written in TypeScript/Node
https://lms.dev
MIT License

LMS Hangs After Displaying "Verification succeeded. The server is running on port <number>" and Doesn't Load Model #44

Open magicxor opened 1 week ago

magicxor commented 1 week ago

Describe the bug
lms hangs after displaying "Verification succeeded. The server is running on port 11435." and doesn't load a model, though LM Studio itself opens successfully.

To Reproduce
Steps to reproduce the behavior (when LM Studio is closed):

  1. Open PowerShell 7.4.3 (pwsh)
  2. Type: lms server start && lms load MaziyarPanahi/DARE_TIES_13B-GGUF --gpu max -y
  3. The LM Studio GUI will open. PowerShell output will be:
    PS C:\Users\unnamed> lms server start && lms load MaziyarPanahi/DARE_TIES_13B-GGUF --gpu max -y
    Attempting to start the server on port 11435...
    Launching LM Studio minimized... (Disable auto-launching via the --no-launch flag.)
    Requested the server to be started on port 11435.
    Verifying the server is running...
    Verification succeeded. The server is running on port 11435.

    LM Studio log:

    [2024-06-27 22:41:21.620] [INFO] [LM STUDIO SERVER] Verbose server logs are ENABLED
    [2024-06-27 22:41:21.641] [INFO] [LM STUDIO SERVER] Success! HTTP server listening on port 11435
    [2024-06-27 22:41:21.642] [INFO] [LM STUDIO SERVER] Supported endpoints:
    [2024-06-27 22:41:21.642] [INFO] [LM STUDIO SERVER] ->  GET  http://localhost:11435/v1/models
    [2024-06-27 22:41:21.642] [INFO] [LM STUDIO SERVER] ->  POST http://localhost:11435/v1/chat/completions
    [2024-06-27 22:41:21.643] [INFO] [LM STUDIO SERVER] ->  POST http://localhost:11435/v1/completions
    [2024-06-27 22:41:21.643] [INFO] [LM STUDIO SERVER] ->  POST http://localhost:11435/v1/embeddings     <------------ NEW!
    [2024-06-27 22:41:21.644] [INFO] [LM STUDIO SERVER] Logs are saved into C:\tmp\lmstudio-server-log.txt
  4. Type lms ls in a new PowerShell window. It hangs and doesn't show anything.
  5. Make a CURL query:
    PS C:\Users\unnamed> curl http://localhost:11435/v1/models
    {
    "data": [],
    "object": "list"
    }
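Since lms itself hangs, querying the /v1/models endpoint directly is a way to confirm whether the server has any model loaded (the empty "data" array above means none). A minimal sketch in TypeScript; the helper name and the response shape are assumptions based on the curl output, not lms internals:

```typescript
// Shape of the /v1/models response, as seen in the curl output above.
interface ModelsResponse {
  object: string;
  data: { id: string }[];
}

// Parse the /v1/models JSON body and return the ids of loaded models.
function loadedModelIds(body: string): string[] {
  const parsed = JSON.parse(body) as ModelsResponse;
  return parsed.data.map((m) => m.id);
}

// Using the exact response from the report: no models are loaded.
const ids = loadedModelIds('{"data": [], "object": "list"}');
console.log(ids.length === 0 ? "no models loaded" : ids.join(", "));
// → no models loaded
```

In practice you would feed this the body of a fetch against http://localhost:11435/v1/models; an empty result while lms load appears to succeed points at the CLI/server handshake rather than the HTTP server itself.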

Expected behavior
The model should load successfully and be listed when queried.

System Info:

yagil commented 1 week ago

@magicxor thanks for the detailed bug report. Are you able to try to reproduce with 0.2.25, available from https://lmstudio.ai?

You need to open LM Studio 0.2.25 at least once so that lms 0.2.25 is set up as well.

magicxor commented 1 week ago

@yagil , thank you for your prompt response.

I tried to reproduce the issue with LM Studio 0.2.25, but it ships with lms v0.2.23. The bug still persists with this setup. Could you please provide the build for lms 0.2.25?

Thank you for your assistance.

yagil commented 1 week ago

we will investigate and update here

sergeinovik commented 2 days ago

I see the same issue when I run this command: "cmd /c lms load bartowski/gemma-2-9b-it-GGUF/gemma-2-9b-it-Q8_0_L.gguf --gpu max -y --context-length 8192 --quiet"

It starts LM Studio, but it does not load the gemma model. By the way, I have the same issue with other models, e.g. Llama.

I've tried the same command with LM Studio v0.2.27, but the update does not help.

P.S.: when using cmd to run lms, I don't get any output even for simple commands like "status" or "ls"; it simply returns nothing.

P.S. 2: I've tried running this (the llama model and the same lms commands) on v0.2.22, and everything worked fine.

UPD: When I start lms on v0.2.27, it shows "lms - LM Studio CLI - v0.2.23"... Is that some kind of issue, or is 0.2.23 the latest version of lms?