SystemSculpt / obsidian-systemsculpt-ai

Enhance your Obsidian App experience with AI-powered tools for note-taking, task management, and much, MUCH more.

Add Ollama compatibility #8

Closed: ncoquelet closed this issue 1 month ago

ncoquelet commented 1 month ago

Hello,

I use Ollama locally and I quickly looked into making SystemSculpt compatible. Since Ollama already supports the OpenAI API for messages, only the model-list retrieval needs to be adapted.

  1. The API "/v1/models" should be replaced with "/api/tags" and the mapping needs to be slightly modified as the response format is not exactly the same, for example:
    if (response.status === 200) {
      const data = response.json;
      return data.models.map((model: any) => ({
        id: model.name,
        name: model.name,
        isLocal: true,
        provider: 'local',
      }));
    }
  2. In ModelSettings.ts, I simplified getAvailableModels() to use AIService.getModels() instead of the current duplicate implementation (a rough end-to-end sketch follows below).
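
To make the change a bit more concrete, here is a rough, self-contained sketch of both steps. The LocalModel interface, the function names, and the default base URL are my own placeholders rather than the plugin's actual code; only the /api/tags endpoint and its models[].name field come from the Ollama docs linked below:

```typescript
import { requestUrl } from 'obsidian';

// Shape the plugin's model list appears to expect; the interface name is a placeholder.
interface LocalModel {
  id: string;
  name: string;
  isLocal: boolean;
  provider: string;
}

// Fetch the locally installed models from Ollama's /api/tags endpoint
// (default port 11434) and map them into the plugin's model shape.
async function getOllamaModels(baseUrl = 'http://localhost:11434'): Promise<LocalModel[]> {
  const response = await requestUrl({ url: `${baseUrl}/api/tags` });
  if (response.status !== 200) {
    return [];
  }
  // /api/tags responds with { models: [{ name, modified_at, size, ... }, ...] }
  return response.json.models.map((model: any) => ({
    id: model.name,
    name: model.name,
    isLocal: true,
    provider: 'local',
  }));
}

// ModelSettings.ts can then simply delegate instead of duplicating the fetch;
// the surrounding class/service wiring is omitted here.
async function getAvailableModels(): Promise<LocalModel[]> {
  return getOllamaModels();
}
```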

=> Tada, it works like a charm for me. Maybe it can help you add Ollama support in the next release.

source: https://github.com/ollama/ollama/blob/main/docs/api.md#list-local-models

Regards

SystemSculpt commented 1 month ago

Great insight! Will include this in the next update, 0.4.3. Cheers! Thanks so much for the great info, highly appreciated.

SystemSculpt commented 1 month ago

@ncoquelet just implemented in the latest 0.4.5 release, let me know if everything works well. (Ollama doesn't allow for streaming without jumping through a lot of hoops currently; hopefully that changes in the future. For now it generates the response in its entirety before pasting it in as the answer.)
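
For anyone who wants to poke at it locally, a minimal sketch of the kind of non-streaming request involved, assuming Ollama's OpenAI-compatible chat endpoint on the default port; the function and model names here are made up for illustration, not the plugin's actual code:

```typescript
import { requestUrl } from 'obsidian';

// Minimal non-streaming completion against Ollama's OpenAI-compatible endpoint.
// With stream: false the whole answer is generated before anything comes back,
// which matches the "generate in entirety, then paste" behaviour described above.
async function completeWithOllama(prompt: string, model = 'llama3'): Promise<string> {
  const response = await requestUrl({
    url: 'http://localhost:11434/v1/chat/completions',
    method: 'POST',
    contentType: 'application/json',
    body: JSON.stringify({
      model,
      stream: false,
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  return response.json.choices[0].message.content;
}
```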