⚠️ NOTE: I've put the wrong ticket number in the branch name.
Description
The final PR for #824!
This one stops the `getModel` calls. We now return the currently configured model from `getStart` or `getLevel` (but only for sandbox).
Screenshots
Before
Refreshing on sandbox (preview mode, so only a single render):
![image](https://github.com/ScottLogic/prompt-injection/assets/118171430/25b83e1d-bc5a-495e-b1e0-dd523c7037d1)
Changing level to sandbox (preview mode, so only a single render):
![image](https://github.com/ScottLogic/prompt-injection/assets/118171430/7373d72b-8b5a-482f-a0d2-336df676cdba)
After
Refreshing on sandbox (preview mode, so only a single render):
![image](https://github.com/ScottLogic/prompt-injection/assets/118171430/c2226faa-5af5-4949-9414-288ed7f87d85)
Changing level to sandbox (preview mode, so only a single render):
![image](https://github.com/ScottLogic/prompt-injection/assets/118171430/2dbff165-b9f9-4073-ab6d-2c854caf9283)
Notes
Here's the whole situation because it's a bit convoluted:
In the backend session we store `chatModel`. This records the id of the model that we want to chat to, as well as its configurable settings (temperature, top p, etc.). But we use the default `chatModel` rather than this configured version for all levels apart from sandbox. So really what this value represents is the `chatModel` used in sandbox (despite the name, and despite it not being part of the `levelState`). I think this is confusing, but I tried to change it and it snowballed into a bit of a mess, so I'm calling it a problem for another day.
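To make that arrangement concrete, here's a rough sketch of the default-vs-configured selection. The type, constant and function names (`ChatModel`, `defaultChatModel`, `modelForLevel`, the `'SANDBOX'` level key, the `'gpt-4'` default) are my guesses, not necessarily what the codebase actually uses:

```typescript
// Hypothetical sketch only: real names and values in the codebase may differ.
interface ChatModelConfiguration {
	temperature: number;
	topP: number;
}

interface ChatModel {
	id: string;
	configuration: ChatModelConfiguration;
}

// A stand-in default model, used by every level except sandbox.
const defaultChatModel: ChatModel = {
	id: 'gpt-4',
	configuration: { temperature: 1, topP: 1 },
};

// The session's chatModel only actually takes effect in sandbox;
// everywhere else we fall back to the default.
function modelForLevel(level: string, sessionChatModel: ChatModel): ChatModel {
	return level === 'SANDBOX' ? sessionChatModel : defaultChatModel;
}
```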
Anyhoo, we need to get the configured `chatModel` whenever we load up or switch to the sandbox. So I've attached a `chatModel` attribute to the `StartResponse` and `LoadLevelResponse`. But because this is only relevant for sandbox, on all other levels this value is undefined.
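In sketch form, the response change looks something like the following (field names and the helper are assumptions based on the description above, not the actual code):

```typescript
// Hypothetical sketch: names are assumptions, not the real response types.
interface ChatModel {
	id: string;
	configuration: { temperature: number; topP: number };
}

interface LoadLevelResponse {
	// ...the existing level fields are elided here
	chatModel?: ChatModel; // only populated for sandbox
}

// Attach the session's configured model only when the requested level
// is sandbox; every other level leaves the field undefined.
function withChatModel(
	response: LoadLevelResponse,
	level: string,
	sessionChatModel: ChatModel
): LoadLevelResponse {
	return level === 'SANDBOX'
		? { ...response, chatModel: sessionChatModel }
		: response;
}
```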
Meanwhile in the frontend, in `MainComponent` we consume these response objects. We set the `chatModel` `useState` value, then drill it down all the way to the `ModelSelection` and `ModelConfiguration` components, which show up in the control panel in sandbox.
I've opted to sync the frontend's copy of the `chatModel` with the backend's, i.e. it will only store the configured `chatModel` used in the sandbox.
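A minimal sketch of that sync, abstracting React's state setter as a plain callback (the real logic lives in `MainComponent`, and `syncChatModel` is a name I've made up for illustration):

```typescript
// Hypothetical sketch: mirrors the backend's chatModel into frontend state.
interface ChatModel {
	id: string;
	configuration: { temperature: number; topP: number };
}

interface LoadLevelResponse {
	chatModel?: ChatModel; // undefined on all levels except sandbox
}

// On load or level switch, mirror the backend value exactly: this sets the
// configured model on sandbox and clears it (undefined) everywhere else,
// so the frontend never keeps a stale copy.
function syncChatModel(
	response: LoadLevelResponse,
	setChatModel: (model: ChatModel | undefined) => void
): void {
	setChatModel(response.chatModel);
}
```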
Concerns
I'm sure there are nicer, less confusing ways to organise all the data, but I don't think it's this ticket's job to do that. (Read: I tried to reorganise it, but it grew arms and legs, so I'm choosing to revert and then ignore the problem.)
I wonder if I should rename `chatModel` across the front and back end to something like `sandboxChatModel`, just to make it really clear what it's there for.
Checklist
Have you done the following?