Closed: zinwelzl closed this issue 9 months ago.
Question: Do you have instructions for how to connect RawDog and LM Studio?

Reply: I haven't tried LM Studio, but it looks like it can run an OpenAI-compatible local server. Set that up, then set `llm_base_url` and `llm_model` like you would for litellm (see https://github.com/AbanteAI/rawdog?tab=readme-ov-file#model-selection).
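For reference, a minimal config along those lines might look like the sketch below. This is untested with LM Studio: the `llm_base_url` and `llm_model` keys come from the linked README section, while the base URL assumes LM Studio's default local server address (`http://localhost:1234/v1`), the `openai/` model prefix reflects how litellm is commonly pointed at generic OpenAI-compatible endpoints, and the `llm_api_key` entry is a placeholder assumption.

```yaml
# ~/.rawdog/config.yaml -- sketch only, not verified against LM Studio
llm_base_url: http://localhost:1234/v1   # LM Studio's default local server address (assumption)
llm_model: openai/local-model            # "openai/" prefix routes through litellm's OpenAI-compatible path;
                                         # replace "local-model" with the model name LM Studio reports
llm_api_key: lm-studio                   # dummy key; LM Studio usually ignores it, but litellm may expect one
```

With something like that in place, start LM Studio's local server from its server tab and run rawdog as usual; requests should then go to the local endpoint instead of OpenAI.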