ScottLogic / prompt-injection

Application which investigates defensive measures against prompt injection attacks on an LLM, with a focus on the exposure of external tools.
MIT License

Simplified route to internal enterprise deployment #867

Closed chrisprice closed 6 months ago

chrisprice commented 6 months ago

Feature Request

Description

There should be minimal overhead for an enterprise wishing to deploy this to their internal users. Currently there's only a local deployment model and a cloud-hosted version optimised for SL's use case.

Additional context

Ideally deployment would be as simple as: build a single container from source, or download a pre-built image (depending on the enterprise's security preferences), then run it following a short set of instructions (expose a port, provide an API secret, etc.).
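For illustration, the "run it with these instructions" step might look like the sketch below. The image name, port number, and `OPENAI_API_KEY` variable are placeholder assumptions for this sketch, not the project's actual names:

```shell
# Build the image from source...
docker build -t prompt-injection .
# ...or, if the enterprise prefers, pull a published image instead.

# Run it: expose the app's port and supply the LLM API secret via the
# environment. Port and variable name here are assumptions.
docker run -d \
  -p 3000:3000 \
  -e OPENAI_API_KEY="<your-api-key>" \
  prompt-injection
```

Keeping the required configuration down to one port mapping and one secret is what makes "minimal assumptions about their infrastructure" achievable.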

Acceptance criteria

GIVEN an enterprise is keen to deploy this for their internal users
WHEN they look for installation instructions
THEN they find instructions which are as simple as possible, with minimal assumptions about their infrastructure

chriswilty commented 6 months ago

It's trivial to allow Express to serve our UI, while still allowing the UI to be served separately if need be. We can provide both a single `npm start`-style command from the project root and a Dockerfile to run in a container, so that the app can be built and started either way.

This should keep everyone happy.
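A minimal sketch of what "Express serves our UI" could look like, assuming the UI is pre-built into a static bundle. The paths, port, and route names here are assumptions for illustration, not the project's actual layout:

```typescript
// Sketch: serve the built UI bundle from the same Express app as the API.
// Directory layout and env variable names are assumptions.
import express from "express";
import path from "path";

const app = express();

// API routes are mounted first, so they take precedence over static files.
// e.g. app.use("/api", apiRouter);

// Serve the pre-built UI assets (path is a placeholder).
const uiDist = path.join(__dirname, "../frontend/dist");
app.use(express.static(uiDist));

// SPA fallback: unknown non-API routes return index.html so that
// client-side routing still works on a full-page refresh.
app.get("*", (_req, res) => {
  res.sendFile(path.join(uiDist, "index.html"));
});

const port = Number(process.env.PORT ?? 3000);
app.listen(port, () => {
  console.log(`Serving UI and API on port ${port}`);
});
```

Because the static middleware is optional, the UI can still be built and hosted separately when an enterprise prefers a CDN or existing web server in front.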

I will re-work the authentication side of things in a subsequent ticket, as I would really like to move the (opt-in) auth integration from the UI code into the cloud infra.