nsbradford / SemanticSearch

Minimal RAG (Retrieval Augmented Generation) website with Pinecone, FastAPI, NextJS, MongoDB
https://semantic-search-six.vercel.app
9 stars · 3 forks

Add sessionId to llm request metadata #14

Closed · nsbradford closed this issue 1 year ago

nsbradford commented 1 year ago

Get sessionId from the frontend (see index.tsx). Requires adding metadata to the llm endpoint and passing it through to llm.py.
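For reference, one way the frontend could mint a sessionId before wiring it through. This is a hypothetical sketch, not code from the repo; index.tsx may already manage the session id differently:

```typescript
// Hypothetical sketch: minting a per-session id on the frontend.
// crypto.randomUUID() is available in modern browsers and recent Node.js;
// fall back to a random string on older runtimes.
function newSessionId(): string {
  if (typeof crypto !== 'undefined' && typeof crypto.randomUUID === 'function') {
    return crypto.randomUUID();
  }
  return Math.random().toString(36).slice(2);
}

const sessionId = newSessionId();
```

A stable id per browser session (rather than per request) is what lets the backend group a user's llm requests together in its metadata.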

ellipsis-dev[bot] commented 1 year ago

To have BitBuilder create a Pull Request with the implementation, the user who created the issue (@nsbradford) can comment below with "LGTM". If the plan is inaccurate, edit the issue description and write a comment with "replan".


Implementation Steps

  1. Add sessionId to LLMChatCompletionRequest interface
    • In the file /ui/shared/api.ts, add a new property sessionId of type string to the LLMChatCompletionRequest interface. The updated interface should look like this:
export interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}
  2. Modify sendLLMRequest function to include sessionId
    • In the file /ui/shared/api.ts, modify the sendLLMRequest function to include sessionId in the request data. The updated function should look like this:
export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {
  const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/`, {...data, sessionId});
  return response.data.text;
}

This change ensures that the sessionId is included in the request data when the sendLLMRequest function is called.

  3. Update sendLLMRequest call in index.tsx
    • In the file /ui/pages/index.tsx, update the call to sendLLMRequest in the handleNewUserPrompt function to include sessionId. The updated function call should look like this:
const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results) }, sessionId)

This change ensures that the sessionId is passed to the sendLLMRequest function when it is called.


Generated with :heart: by www.bitbuilder.ai. Questions? Check out the documentation.

nsbradford commented 1 year ago

replan

ellipsis-dev[bot] commented 1 year ago

To have BitBuilder create a Pull Request with the implementation, the user who created the issue (@nsbradford) can comment below with "LGTM". If the plan is inaccurate, edit the issue description and write a comment with "replan".


Implementation Steps

  1. Add sessionId to LLMChatCompletionRequest interface
    • In the file /ui/shared/api.ts, add a new property sessionId of type string to the LLMChatCompletionRequest interface. The updated interface should look like this:
export interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}
  2. Modify sendLLMRequest function to include sessionId
    • In the file /ui/shared/api.ts, modify the sendLLMRequest function to include sessionId in the request data. The updated function should look like this:
export async function sendLLMRequest(data: LLMChatCompletionRequest): Promise<string> {
  const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/`, {...data, sessionId: data.sessionId});
  return response.data.text;
}

This change ensures that the sessionId is included in the request data when the sendLLMRequest function is called.

  3. Update sendLLMRequest call in index.tsx
    • In the file /ui/pages/index.tsx, update the call to sendLLMRequest in the handleNewUserPrompt function to include sessionId. The updated function call should look like this:
const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results), sessionId });

This change ensures that the sessionId is passed to the sendLLMRequest function when it is called.
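Independent of the HTTP call, the payload merge in this plan can be checked in isolation. A minimal sketch: the interface shapes below are assumptions based on the snippets above, and the axios call is replaced by a pure helper so the example is dependency-free:

```typescript
// Shapes assumed from the plan's snippets above (hypothetical, simplified).
interface ChatCompletionRequestMessage {
  role: string;
  content: string;
}

interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}

// Builds the POST body the same way the plan's axios call does:
// spread data, then re-set sessionId from data.
function buildLLMPayload(data: LLMChatCompletionRequest): LLMChatCompletionRequest {
  return { ...data, sessionId: data.sessionId };
}

const payload = buildLLMPayload({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'hello' }],
  sessionId: 'abc-123',
});
```

Note that once sessionId lives on the interface, the explicit `sessionId: data.sessionId` in the spread is redundant: `{ ...data }` already copies it. It is harmless, but `{ ...data }` alone would produce the same body.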


Generated with :heart: by www.bitbuilder.ai. Questions? Check out the documentation.

