lmstudio-ai / lmstudio.js

LM Studio TypeScript SDK (pre-release public alpha)
https://lmstudio.ai/docs/lmstudio-sdk/quick-start
Apache License 2.0
271 stars · 42 forks

Server model alias #11

Closed. amartins23 closed this issue 2 months ago.

amartins23 commented 2 months ago

Some applications expect specific model names when connecting to OpenAI. LM Studio presents an OpenAI-compatible API but restricts model names to a specific structure, so it would be useful if an alias could be specified for the model name when starting the server (similar to the --alias parameter in the llama.cpp server: https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md ).

ryan-the-crayon commented 2 months ago

Hi, I believe we already support this via the "identifier".

ryan-the-crayon commented 2 months ago

Closing this for now. Feel free to reopen if this does not address your needs.

amartins23 commented 2 months ago

I'm sorry, but I don't see anywhere in the UI to set the identifier, either on the "My Models" page or on the "Server" page.

(Screenshots: Server page, My Models page)

ryan-the-crayon commented 2 months ago

I see. However, if you are loading models from the chat/server page, you can only load one at a time. In that case, our OpenAI-compatible server does not check the model name: any string specified as the model will resolve to the single loaded model. Does this address your needs?
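The single-model behaviour described above can be sketched like this (a sketch of the behaviour as described, not the actual LM Studio server code):

```typescript
// Sketch of the resolution behaviour described above: with exactly one
// loaded model, any requested model string resolves to that model; with
// several loaded, the requested name must match a loaded identifier.
function resolveModel(loadedModels: string[], requested: string): string | null {
  if (loadedModels.length === 1) {
    // Single-model case: the requested name is ignored.
    return loadedModels[0];
  }
  // Multi-model case: the requested name must match exactly.
  return loadedModels.find((m) => m === requested) ?? null;
}

console.log(resolveModel(["llama-3-8b"], "gpt-3.5-turbo")); // "llama-3-8b"
console.log(resolveModel(["llama-3-8b", "phi-3"], "gpt-4")); // null
```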

amartins23 commented 1 month ago

The LM Studio server may not check the model name, but the software calling the server does check the model listing, and it refuses to work if the listed model name does not match what it expects.
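The client-side check being described is typically a lookup against the OpenAI-style `GET /v1/models` listing, whose response has the shape `{ data: [{ id: string }, ...] }`. A minimal sketch of such a check (the specific model ids here are illustrative, not from the thread):

```typescript
// Sketch of the client-side check described above: many OpenAI clients
// fetch the model listing and refuse to run if the expected id is absent.
interface ModelsListing {
  data: { id: string }[];
}

function hasExpectedModel(listing: ModelsListing, expected: string): boolean {
  return listing.data.some((m) => m.id === expected);
}

// LM Studio lists the loaded model under its own identifier, so a client
// hard-coded to expect e.g. "gpt-3.5-turbo" will not find it:
const listing: ModelsListing = { data: [{ id: "llama-3-8b-instruct" }] };
console.log(hasExpectedModel(listing, "gpt-3.5-turbo")); // false
```

This is why the server ignoring the model name on completion requests is not enough on its own: the client bails out at the listing step before ever sending a completion.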

ryan-the-crayon commented 1 month ago

I see. Is it possible to disable this check in the software you are using? If not, you might need to load models via the multi-model playground or with the CLI. Both methods allow you to specify an identifier (i.e. an alias).
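Conceptually, the identifier acts as an alias chosen at load time, and the server then lists and serves the model under that name. A pure-logic sketch of that idea (this is an illustration of the concept, not the SDK or CLI API; the default-to-path behaviour and model path shown are assumptions):

```typescript
// Sketch of the identifier-as-alias concept: models are registered under
// a caller-chosen identifier, which is what the server's model listing
// and name checks then see.
const loaded = new Map<string, string>(); // identifier -> model path

function loadWithIdentifier(modelPath: string, identifier?: string): string {
  // Hypothetical default: fall back to the model path as the identifier.
  const id = identifier ?? modelPath;
  loaded.set(id, modelPath);
  return id;
}

// Loading under an alias the client software expects:
loadWithIdentifier("lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF", "gpt-3.5-turbo");
console.log(loaded.has("gpt-3.5-turbo")); // true
```

With the model registered under the expected alias, a client that validates the model listing (as described earlier in the thread) would find the name it is looking for.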