Closed · ellipsis-dev[bot] closed this 3 months ago
The latest updates on your projects. Learn more about Vercel for Git ↗︎
| Name | Status | Preview | Comments | Updated (UTC) |
|---|---|---|---|---|
| semantic-search-mini | ❌ Failed (Inspect) | | | Sep 26, 2023 8:03pm |
Sorry, BitBuilder encountered an error while addressing comments in this Pull Request. Please try again later. (wflow_hTrxhBkVLD7U4m7d) :robot:
Summary:
Issue: https://github.com/nsbradford/SemanticSearch/issues/36
Implementation:
* In the `/ui/pages/index.tsx` file, modify the `handleNewUserPrompt` function to pass the `sessionId` in the `sendLLMRequest` call. The `sessionId` is already retrieved at the start of the `PromptPage` function, so it can be passed directly to `sendLLMRequest`. The modified call should look like this: `const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results) }, sessionId)`
* In the `/ui/shared/api.ts` file, modify the `sendLLMRequest` function to accept `sessionId` as a parameter. The modified function should look like this: `export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {...}`. Also, include the `sessionId` in the request URL rather than the payload: ``const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/${sessionId}`, data);``
* In the `/backend/models.py` file, modify the `LLMChatCompletionRequest` model to include `sessionId` as a field. The modified model should look like this: `class LLMChatCompletionRequest(BaseModel): model: str; messages: List[LLMChatCompletionMessage]; sessionId: str`
* In the `/backend/main.py` file, modify the `llm` endpoint to accept `sessionId` as a path parameter. The modified endpoint should look like this: `@app.post('/llm/{sessionId}')`. Also, modify the `llm` function to pass `sessionId` to the `llm_get` function: `result = await llm_get(request.model, request.messages, sessionId)`
* In the `/backend/llm.py` file, modify the `llm_get` function to accept `sessionId` as a parameter. The modified signature should look like this: `async def llm_get(model: str, messages: List[LLMChatCompletionMessage], sessionId: str) -> str`. Also, include `sessionId` in the metadata of the `acompletion` call: `metadata={"environment": getEnvironment(), "sessionId": sessionId}`
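The backend side of the plan above can be sketched end to end. This is a minimal, runnable sketch, not the repository's actual code: `acompletion` and `getEnvironment` are stand-ins stubbed here so the example runs standalone (the real project presumably imports them from an LLM client library and its own helpers), and `LLMChatCompletionMessage` is a plain dataclass rather than the project's Pydantic model.

```python
import asyncio
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class LLMChatCompletionMessage:
    """Stand-in for the Pydantic model in /backend/models.py."""
    role: str
    content: str


def getEnvironment() -> str:
    # Hypothetical stub for the project's real environment helper.
    return "development"


async def acompletion(model: str,
                      messages: List[LLMChatCompletionMessage],
                      metadata: Dict[str, Any]) -> Dict[str, Any]:
    # Stub for the real LLM completion call; it just echoes its inputs
    # so we can verify that the metadata was forwarded correctly.
    return {"text": f"summary from {model}", "metadata": metadata}


async def llm_get(model: str,
                  messages: List[LLMChatCompletionMessage],
                  sessionId: str) -> str:
    # The proposed change: sessionId is an explicit parameter and is
    # forwarded inside the acompletion metadata alongside the environment.
    result = await acompletion(
        model=model,
        messages=messages,
        metadata={"environment": getEnvironment(), "sessionId": sessionId},
    )
    return result["text"]


summary = asyncio.run(
    llm_get("gpt-3.5-turbo",
            [LLMChatCompletionMessage("user", "hello")],
            "session-123")
)
print(summary)  # summary from gpt-3.5-turbo
```

In the real endpoint, `sessionId` would arrive as the FastAPI path parameter declared by `@app.post('/llm/{sessionId}')` and be handed to `llm_get` exactly as the plan's `result = await llm_get(request.model, request.messages, sessionId)` line shows.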
Plan Feedback: Approved by @nsbradford
Something look wrong? If this Pull Request doesn't contain the expected changes, add more information to #36. Then, add the `bitbuilder:create` label to try again. For more information, check the documentation.

Generated with :heart: by www.bitbuilder.ai