Closed GoodbyePlanet closed 3 months ago
@GoodbyePlanet you can use any model available in ollama just updating the client. That file helps with autocompleting in your IDE but it accepts any string.
I hope to have time this weekend to cut a new release covering the new features in the SDK and Ollama.
Thank you :)
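The autocomplete behavior described above can be sketched with a common TypeScript pattern: a union of known model id literals plus `(string & {})`, which keeps IDE suggestions for the listed ids while still accepting any string. The type and function names below are illustrative, not the exact identifiers used in the repo.

```typescript
// Illustrative sketch (hypothetical names): known ids get autocomplete,
// but the `(string & {})` branch means any other string is also valid.
type OllamaChatModelId = 'llama3' | 'llama3.1' | (string & {});

// A stand-in for creating a chat client with a given model id.
function chat(model: OllamaChatModelId): string {
  return `using model: ${model}`;
}

// Both a listed id and an arbitrary custom id type-check:
const listed = chat('llama3.1');
const custom = chat('my-custom-model');
```

With this pattern, adding new ids to the settings file only improves autocompletion; it is not required for a model to work.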
Hi @sgomez
Thank you for creating ollama provider.
I would like to add support for the new llama3.1 models. Is changing
ollama-chat-settings.ts
enough to add support for them?