sgomez / ollama-ai-provider

Vercel AI Provider for running LLMs locally using Ollama
https://www.npmjs.com/package/ollama-ai-provider

Add llama3.1 #10

Closed — GoodbyePlanet closed this 3 months ago

GoodbyePlanet commented 3 months ago

Hi @sgomez

Thank you for creating the Ollama provider.

I would like to add support for the new llama3.1 models. Is changing ollama-chat-settings.ts enough to add support for them?

sgomez commented 3 months ago

@GoodbyePlanet you can use any model available in Ollama just by updating the model id in your client code. That file only provides autocomplete in your IDE; the type accepts any string.

See https://github.com/sgomez/ollama-ai-provider/blob/main/packages/ollama/src/ollama-chat-settings.ts#L60
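A minimal sketch of the widening pattern that file relies on (the listed ids here are illustrative, not the real list from ollama-chat-settings.ts): known ids drive IDE autocomplete, while the `(string & {})` member keeps the union open to any other model string, so `'llama3.1'` type-checks without any change to the provider.

```typescript
// Hypothetical subset of the model id union; the real list lives in
// ollama-chat-settings.ts. The `(string & {})` member widens the type
// so ANY string is accepted, while the literals still autocomplete.
type OllamaChatModelId =
  | 'llama3'
  | 'mistral'
  | (string & {});

// Both assignments compile: a listed id and an arbitrary new one.
const listed: OllamaChatModelId = 'llama3';
const custom: OllamaChatModelId = 'llama3.1';

console.log(listed, custom);
```

In practice this means a call such as `ollama('llama3.1')` works today; the id is passed through to the Ollama API unchanged.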

I hope to have time this weekend to cut a new release and review the new features in the SDK and Ollama.

GoodbyePlanet commented 3 months ago

Thank you :)