ScottLogic / prompt-injection

Application which investigates defensive measures against prompt injection attacks on an LLM, with a focus on the exposure of external tools.
MIT License

828 streamline chat model configuration info message network call #875

Closed pmarsh-scottlogic closed 6 months ago

pmarsh-scottlogic commented 6 months ago

Description

Previously, when the user configured the chatModel, the frontend sent a request to the backend. If the backend successfully configured the model, the frontend then sent a further request to add an info message (e.g. 'changed frequencyPenalty to 0.6') to the chat history.

This PR changes the flow so that when the backend successfully configures the chatModel, it generates the info message itself, appends it to the chat history, and returns it to the frontend, removing the need for the second network call.
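The new backend flow can be sketched roughly as below. This is a minimal illustration, not the repository's actual code: the type shape, function name, and message format are all assumptions.

```typescript
// Hypothetical shape for a chat info message; the real project's types differ.
type ChatInfoMessage = {
	infoMessage: string;
	chatMessageType: 'GENERIC_INFO';
};

// Sketch of the backend handling a successful chatModel configuration:
// it builds the info message itself, appends it to the history, and
// returns it so the frontend can render it from the same response.
function configureChatModelParam(
	history: ChatInfoMessage[],
	param: string,
	value: number
): ChatInfoMessage {
	const message: ChatInfoMessage = {
		// e.g. "changed frequencyPenalty to 0.6"
		infoMessage: `changed ${param} to ${value}`,
		chatMessageType: 'GENERIC_INFO',
	};
	history.push(message);
	return message;
}
```

With this shape, the frontend no longer issues a follow-up request to record the info message; the single configuration response already carries it.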

Notes

Checklist

Have you done the following?