promptengineers-ai / llm-server

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
https://promptengineersai.netlify.app

Configure to deploy to Cloud Run #104

Closed: ryaneggz closed this issue 4 months ago

ryaneggz commented 4 months ago

This could mean that we need to set up a database somewhere, as I believe Cloud Run is designed for stateless containers, so our current SQLite setup will not work.
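As a rough sketch of what that change might look like (assuming the server builds its database connection with SQLAlchemy and that `DATABASE_URL` is the variable name we'd adopt, neither of which is confirmed in the current code), we could keep SQLite for local dev but point at a managed Postgres (e.g. Cloud SQL) when running on Cloud Run, where the container filesystem is ephemeral:

```python
import os

from sqlalchemy import create_engine

# Fall back to SQLite locally; on Cloud Run, set DATABASE_URL to a managed
# Postgres instance, since anything written to the container's disk is lost
# when the instance is recycled. DATABASE_URL is an assumed variable name.
DATABASE_URL = os.getenv(
    "DATABASE_URL",
    "sqlite:///./local.db",  # fine for local dev, not durable on Cloud Run
)

# pool_pre_ping guards against stale connections to an external database
engine = create_engine(DATABASE_URL, pool_pre_ping=True)
```

The env var could then be injected at deploy time (e.g. via `gcloud run deploy ... --set-env-vars DATABASE_URL=...`), so no code change is needed per environment.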

ryaneggz commented 4 months ago

It is deployed.