a16z-infra / ai-town

An MIT-licensed, deployable starter kit for building and customizing your own version of AI Town, a virtual town where AI characters live, chat, and socialize.
https://convex.dev/ai-town

why does running `just convex env set OLLAMA_HOST http://localhost:11434` result in "grep: .env.local: No such file or directory"? #248

Closed: cnisno1ok closed this issue 3 months ago

cnisno1ok commented 3 months ago

I tried using VMware to deploy the project.

ianmacartney commented 3 months ago

I'm not sure what the question is, but I'll explain a bit in case that helps.

Running `just convex env set OLLAMA_HOST` sets an environment variable on the Convex backend. That variable points to wherever you're running Ollama; if you're running it locally, that's typically http://localhost:11434. However, if you're running the backend in a container, you need to either expose the local port or find some way for the container to route to the Ollama server. I don't know enough about VMware to advise on how.
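To make the steps above concrete, here is a hedged sketch of the commands involved. Assumptions not confirmed by the thread: that the `just` recipe greps `.env.local` in the project root (inferred from the error message, so the file must exist first), and that `host.docker.internal` is available if the backend runs under Docker Desktop (other setups may need a different route to the host).

```shell
# The "grep: .env.local: No such file or directory" error suggests the just
# recipe reads .env.local, so make sure the file exists in the project root
# (assumption inferred from the error message; its expected contents may vary):
touch .env.local

# Point the Convex backend at a locally running Ollama (command from the issue):
just convex env set OLLAMA_HOST http://localhost:11434

# If the backend runs inside a container, "localhost" refers to the container
# itself, not the host machine. On Docker Desktop (assumption), the special
# host-gateway hostname can route to the host instead:
#   just convex env set OLLAMA_HOST http://host.docker.internal:11434
```

The key point is that OLLAMA_HOST is resolved from wherever the backend process runs, so the URL must be reachable from that environment, not from your laptop's shell.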