promptengineers-ai / llm-server

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
https://promptengineersai.netlify.app

Support for Supabase Vectorstore #130

Closed ryaneggz closed 4 months ago

ryaneggz commented 4 months ago

The main database was recently migrated to Supabase. The app now supports MySQL, SQLite, and Postgres (which Supabase is built on).

The current vector database used by https://promptengineersai.netlify.app is Redis, which is hosted on a VM of mine.

I would like to support Supabase so that vectors and data can live in one place, with fewer dependencies overall.

https://python.langchain.com/v0.2/docs/integrations/vectorstores/supabase/#maximal-marginal-relevance-searches
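For reference, a minimal sketch of what this could look like with LangChain's `SupabaseVectorStore`, following the linked docs. The table name `documents`, the `match_documents` RPC, the environment variable names, and the use of OpenAI embeddings are assumptions from the docs' defaults, not this repo's actual config:

```python
import os

from supabase.client import create_client
from langchain_community.vectorstores import SupabaseVectorStore
from langchain_openai import OpenAIEmbeddings

# Assumed environment variables; adjust to the server's own settings.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])
embeddings = OpenAIEmbeddings()

# Table and RPC function names follow the LangChain docs' defaults.
vector_store = SupabaseVectorStore(
    client=supabase,
    embedding=embeddings,
    table_name="documents",
    query_name="match_documents",
)

# Maximal marginal relevance search, as described in the linked docs.
docs = vector_store.max_marginal_relevance_search("How do agents work?", k=4)
```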

The only thing Redis will be used for after this is as a broker for streaming statuses back to the client. That small piece can be moved to Redis Cloud afterward; since the broker does not persist any data, we can get away with the free tier there.
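For context, the broker role is just pub/sub style status fan-out. A minimal sketch with `redis-py` (the channel name and connection details are placeholders, not this repo's actual setup):

```python
import redis

# Placeholder connection; with Redis Cloud this would point at the hosted instance.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Server side: publish a status update for a streaming task.
r.publish("stream:task-123", "retrieval_complete")

# Client-facing side: relay status messages as they arrive.
pubsub = r.pubsub()
pubsub.subscribe("stream:task-123")
for message in pubsub.listen():
    if message["type"] == "message":
        print(message["data"])  # forward to the client, e.g. over SSE/WebSocket
```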

ryaneggz commented 4 months ago

Implemented PGVector and currently don't see any reason to support the Supabase vectorstore. Same thing under the hood, I would imagine.
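For reference, a minimal sketch of the PGVector path with LangChain. The connection string, collection name, and embedding model are placeholders, not necessarily what this repo uses:

```python
from langchain_community.vectorstores.pgvector import PGVector
from langchain_openai import OpenAIEmbeddings

# Placeholder Postgres connection string; for Supabase this would be its
# Postgres connection URI.
CONNECTION_STRING = "postgresql+psycopg2://user:password@localhost:5432/llm_server"

embeddings = OpenAIEmbeddings()

store = PGVector(
    connection_string=CONNECTION_STRING,
    embedding_function=embeddings,
    collection_name="documents",
)

# Ingest and query work the same as any other LangChain vectorstore.
store.add_texts(["Supabase is built on Postgres."])
results = store.similarity_search("What is Supabase built on?", k=1)
```

Since Supabase exposes its underlying Postgres database directly, pointing PGVector at the Supabase connection string should cover the same use case as the dedicated Supabase vectorstore.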