Closed SuperLotar closed 2 months ago
Please add support for working with a local LM Studio server.
hey @SuperLotar - you can use LM Studio now!
I have added compatibility for any OpenAI-compatible server. You can use the custom LLM provider to set a custom base URL and point it at your LM Studio server.
Check out the README & cookbook for an example.
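For anyone landing here: a minimal sketch of what "custom base URL" means in practice. LM Studio's local server defaults to `http://localhost:1234/v1` and speaks the same `/chat/completions` protocol as OpenAI's API, so an OpenAI-compatible client only needs its base URL swapped. The helper name and model name below are illustrative, not part of this project's API:

```python
# Hypothetical sketch: building a chat-completion request against any
# OpenAI-compatible server by swapping the base URL. LM Studio's local
# server defaults to http://localhost:1234/v1.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a custom base URL."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Pointing at a local LM Studio server instead of api.openai.com:
req = build_chat_request("http://localhost:1234/v1", "local-model", "Hello!")
print(req.full_url)  # http://localhost:1234/v1/chat/completions
```

Sending `req` with `urllib.request.urlopen` (or using any OpenAI SDK with the same base URL) then talks to whichever model is loaded in LM Studio.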