supabase-community / seed

Automatically seed your database with production-like dummy data based on your schema for local development and testing.
MIT License

Ollama support #198

Open · michaelmior opened this issue 2 months ago

michaelmior commented 2 months ago

Feature request

Is your feature request related to a problem? Please describe.

I'd like to be able to use a local model with Ollama.

Describe the solution you'd like

I want to be able to use a local Ollama instance with seed.

Describe alternatives you've considered

Cloud-based models require an internet connection as well as an account with a third party. It would be possible to use something like LocalAI, which provides an OpenAI-compatible API for local models, but that is a fairly heavyweight solution.

Additional context

LangChain.js already has Ollama support. It seems like this could be as simple as supporting an OLLAMA_MODEL environment variable that, when set, is used to construct a ChatOllama instance instead of the cloud-backed model. A rough sketch of what I mean is below.
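To illustrate, here is a minimal sketch of how the model selection could work. This is not existing seed configuration: the OLLAMA_MODEL and OLLAMA_BASE_URL variable names, the createChatModel helper, and the OpenAI fallback are all hypothetical, and the imports assume the @langchain/ollama and @langchain/openai packages.

```ts
// Hypothetical sketch: use a local Ollama model when OLLAMA_MODEL is set,
// otherwise fall back to a cloud-backed OpenAI model. Variable names and the
// helper itself are suggestions, not part of seed's current API.
import { ChatOllama } from "@langchain/ollama";
import { ChatOpenAI } from "@langchain/openai";
import type { BaseChatModel } from "@langchain/core/language_models/chat_models";

export function createChatModel(): BaseChatModel {
  const ollamaModel = process.env.OLLAMA_MODEL;

  if (ollamaModel) {
    // Local path: talk to an Ollama instance, defaulting to its standard port.
    return new ChatOllama({
      model: ollamaModel, // e.g. "llama3.1"
      baseUrl: process.env.OLLAMA_BASE_URL ?? "http://localhost:11434",
      temperature: 0.7,
    });
  }

  // Default path: existing cloud-based behaviour (model name here is illustrative).
  return new ChatOpenAI({ model: "gpt-4o-mini" });
}
```

The nice part of this approach is that nothing else in the data-generation code would need to change: both classes implement the same LangChain chat-model interface, so the Ollama path would be a drop-in swap selected purely by environment variables.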

BRAVO68WEB commented 2 months ago

+1

chirag3003 commented 2 months ago

+1

Ba3a-G commented 1 month ago

+1 :rocket: