Closed: ncoquelet closed this issue 1 month ago
Great insight! Will include this in the next update, 0.4.3. Cheers! Thanks so much for the great info, highly appreciated.
@ncoquelet just implemented this in the latest 0.4.5 release; let me know if everything works well. (Ollama doesn't currently allow streaming without jumping through a lot of hoops; hopefully that changes in the future. For now it generates the response in its entirety before pasting it in as the answer.)
Hello,
I use Ollama locally, and I quickly looked into making SystemSculpt compatible with it. Since Ollama already supports the OpenAI API for chat messages, only the model-list retrieval needs to be adapted.
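For context, here is a minimal sketch of what that compatibility looks like: a standard OpenAI-style chat request sent to Ollama's OpenAI-compatible endpoint. The base URL assumes a default local install, and the model name is illustrative, not from this issue:

```typescript
// Minimal sketch: Ollama exposes an OpenAI-compatible chat endpoint,
// so an existing OpenAI-style request body works unchanged.
// 'http://localhost:11434' is the default local install; 'llama3' is
// just an example of a locally pulled model.
async function chat(prompt: string): Promise<string> {
  const response = await fetch('http://localhost:11434/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3',
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  const data = (await response.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```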
In `ModelSettings.ts`, I simplified `getAvailableModels()` to use `AIService.getModels()` instead of the current duplicate implementation. => Tada, it works like a charm for me. Maybe it can help you add Ollama support in the next release.
source: https://github.com/ollama/ollama/blob/main/docs/api.md#list-local-models
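For reference, a minimal sketch of the adapted model retrieval, using the `/api/tags` list endpoint from the docs linked above. The `getAvailableModels` name comes from this issue; the base URL and the `Model` shape are illustrative assumptions:

```typescript
// Minimal sketch: list locally available Ollama models via GET /api/tags.
// The response shape ({ models: [{ name, ... }] }) follows the Ollama API
// docs linked above; OLLAMA_BASE_URL and the Model type are assumptions.
const OLLAMA_BASE_URL = 'http://localhost:11434';

interface Model {
  id: string;
  name: string;
}

async function getAvailableModels(): Promise<Model[]> {
  const response = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
  if (!response.ok) {
    throw new Error(`Ollama returned ${response.status} ${response.statusText}`);
  }
  const data = (await response.json()) as { models: { name: string }[] };
  // Ollama identifies models by name (e.g. "llama3:latest"), so reuse it as the id.
  return data.models.map((m) => ({ id: m.name, name: m.name }));
}
```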
Regards