Closed: broxdeez closed this issue 2 months ago
I see Ollama listed as a supported LLM in the README, but I can't seem to make it work by modifying config.json. I changed the endpoint to localhost and set the api_type to ollama.
I get `unknown api type ollama`.
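For reference, this is roughly the change I made. The key names below are from memory and may not match the project's actual config schema; 11434 is just Ollama's default port, and the model name is only an example:

```json
{
  "api_type": "ollama",
  "base_url": "http://localhost:11434",
  "model": "llama3"
}
```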
Sorry, Ollama support is only in the main branch and hasn't been released yet.
🫡 thanks for the update