protectai / rebuff

LLM Prompt Injection Detector
https://playground.rebuff.ai
Apache License 2.0
1.14k stars 82 forks source link

Add support for fully local deployment #10

Open woop opened 1 year ago

woop commented 1 year ago

Step one is #16. Then to add support for a local postgres/supabase. Ideally we can have a docker compose setup that runs in a completely isolated manner with only OpenAI calls being a managed dependency. Then we can introduce a local model (even if only for testing purposes).

We should consider LangChain as an abstraction to vector databases because they already have support for multiple stores.

seanpmorgan commented 1 year ago

Commenting to say that the high level goal here to enable local development is a high priority task. We're going to move the server itself out so the Python SDK and Typescript SDK can be used locally with a vectordb (probably Chroma) and OpenAI calls / local LLM.

dvirginz commented 10 months ago

That would be a great capability. We are working in an isolated environment with only Azure OpenAI ports opened, and having the server run on-premises would help us a lot.