Closed nsbradford closed 1 year ago
To have BitBuilder create a Pull Request with the implementation, the user who created the issue (@nsbradford) can comment below with "LGTM". If the plan is inaccurate, edit the issue description and write a comment with "replan".
In the `/ui/pages/index.tsx` file, modify the `handleNewUserPrompt` function to include the `sessionId` in the `sendLLMRequest` function call. The `sessionId` is already being retrieved at the start of the `PromptPage` function, so it can be passed directly to `sendLLMRequest`. The modified call should look like this:

```typescript
const llmSummary = await sendLLMRequest(
  { model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results) },
  sessionId,
);
```
In the `/ui/shared/api.ts` file, modify the `sendLLMRequest` function to accept `sessionId` as a parameter. The modified function should look like this:

```typescript
export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {...}
```

Also, include `sessionId` in the request URL and payload:

```typescript
const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/${sessionId}`, { ...data, sessionId });
```
In the `/backend/models.py` file, modify the `LLMChatCompletionRequest` model to include `sessionId` as a field. The modified model should look like this:

```python
class LLMChatCompletionRequest(BaseModel):
    model: str
    messages: List[LLMChatCompletionMessage]
    sessionId: str
```
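For context, here is a self-contained sketch of the updated model. The fields of `LLMChatCompletionMessage` are not shown in this plan, so the `role`/`content` shape below is an assumption:

```python
from typing import List
from pydantic import BaseModel


class LLMChatCompletionMessage(BaseModel):
    # Assumed shape; the actual fields live in the project's models.py
    role: str
    content: str


class LLMChatCompletionRequest(BaseModel):
    model: str
    messages: List[LLMChatCompletionMessage]
    sessionId: str  # new field carrying the session identifier
```

With the new field, any request missing `sessionId` will fail Pydantic validation, which surfaces wiring mistakes early.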
In the `/backend/main.py` file, modify the `llm` endpoint to accept `sessionId` as a path parameter. The modified endpoint should look like this:

```python
@app.post('/llm/{sessionId}')
```

Also, modify the `llm` function to pass `sessionId` to the `llm_get` function:

```python
result = await llm_get(request.model, request.messages, sessionId)
```
In the `/backend/llm.py` file, modify the `llm_get` function to accept `sessionId` as a parameter. The modified signature should look like this:

```python
async def llm_get(model: str, messages: List[LLMChatCompletionMessage], sessionId: str) -> str:
```

Also, include `sessionId` in the metadata of the `acompletion` call:

```python
metadata={"environment": getEnvironment(), "sessionId": sessionId}
```
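A runnable sketch of the updated `llm_get` is below. The real `acompletion` client call and `getEnvironment` helper are not shown in this plan, so both are replaced here with stubs; only the `sessionId`-in-metadata wiring is the point:

```python
from typing import Any, Dict, List


def getEnvironment() -> str:
    # Stub; the real helper presumably reads deployment configuration
    return "development"


async def acompletion(model: str, messages: List[Dict[str, str]], metadata: Dict[str, Any]) -> Dict[str, Any]:
    # Stub standing in for the real LLM completion client
    return {"metadata": metadata, "text": "ok"}


async def llm_get(model: str, messages: List[Dict[str, str]], sessionId: str) -> str:
    # Forward sessionId into the call metadata so downstream logging
    # and tracing can attribute the completion to a session
    response = await acompletion(
        model=model,
        messages=messages,
        metadata={"environment": getEnvironment(), "sessionId": sessionId},
    )
    return response["text"]
```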
Generated with :heart: by www.bitbuilder.ai. Questions? Check out our documentation.
lgtm
Get `sessionId` from the frontend in `index.tsx` and pass it through `sendLLMRequest` in `api.ts`. Add metadata to the `llm` endpoint in `main.py` and pass to `llm.py` - you might have to edit `models.py` accordingly.