lmstudio-ai / .github

Running multiple models at the same time? #7

Open pachacamac opened 6 months ago

pachacamac commented 6 months ago

Would love to run a small LLM like Mistral 7B as one server and a small vision model like Obsidian as another. This could be done via separate endpoints/ports, or even per-model API keys.

PS: So far, starting LM Studio twice works well as a hack.
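With two instances running, each exposes its own OpenAI-compatible `/v1/chat/completions` endpoint, so client code can route requests by model. A minimal sketch, assuming the first instance listens on LM Studio's default port 1234 and the second was configured for 1235 (both ports and model names here are placeholders):

```python
import json
import urllib.request

# Assumed setup: two LM Studio instances, each serving one model.
# Port 1234 is LM Studio's default; 1235 is a hypothetical second port.
SERVERS = {
    "mistral-7b": "http://localhost:1234/v1/chat/completions",
    "obsidian": "http://localhost:1235/v1/chat/completions",
}

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request routed to the
    server that hosts the given model."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        SERVERS[model],
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Sending a request (requires both servers to actually be running):
# with urllib.request.urlopen(build_request("mistral-7b", "Hello")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

This keeps the "two instances" workaround scriptable until multi-model serving lands in a single instance.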