Closed: nsbradford closed this issue 1 year ago.
To have BitBuilder create a Pull Request with the implementation, the user who created the issue (@nsbradford) can comment below with "LGTM". If the plan is inaccurate, edit the issue description and write a comment with "replan".
In `/ui/shared/api.ts`, add a new property `sessionId` of type `string` to the `LLMChatCompletionRequest` interface. The updated interface should look like this:

```typescript
export interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}
```
In `/ui/shared/api.ts`, modify the `sendLLMRequest` function to include `sessionId` in the request data. The updated function should look like this:

```typescript
export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {
  const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/`, {...data, sessionId});
  return response.data.text;
}
```

This change ensures that the `sessionId` is included in the request data when the `sendLLMRequest` function is called.
In `/ui/pages/index.tsx`, update the call to `sendLLMRequest` in the `handleNewUserPrompt` function to include `sessionId`. The updated function call should look like this:

```typescript
const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results) }, sessionId);
```

This change ensures that the `sessionId` is passed to the `sendLLMRequest` function when it is called.
Generated with :heart: by www.bitbuilder.ai. Questions? Check out the documentation.
replan
To have BitBuilder create a Pull Request with the implementation, the user who created the issue (@nsbradford) can comment below with "LGTM". If the plan is inaccurate, edit the issue description and write a comment with "replan".
In `/ui/shared/api.ts`, add a new property `sessionId` of type `string` to the `LLMChatCompletionRequest` interface. The updated interface should look like this:

```typescript
export interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}
```
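For reference, here is a minimal sketch of how a value conforming to the updated interface would be constructed. The message shape is an assumption (the real `ChatCompletionRequestMessage` comes from the OpenAI SDK types), and the `sessionId` value is hypothetical:

```typescript
// Sketch only: assumes ChatCompletionRequestMessage has at least
// `role` and `content` fields, as in the OpenAI SDK types.
interface ChatCompletionRequestMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface LLMChatCompletionRequest {
  model: string;
  messages: ChatCompletionRequestMessage[];
  sessionId: string;
}

// With sessionId on the interface, a request object no longer
// type-checks unless the caller supplies it.
const request: LLMChatCompletionRequest = {
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Summarize the results.' }],
  sessionId: 'abc-123', // hypothetical id
};
```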
In `/ui/shared/api.ts`, modify the `sendLLMRequest` function to include `sessionId` in the request data. The updated function should look like this:

```typescript
export async function sendLLMRequest(data: LLMChatCompletionRequest): Promise<string> {
  const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/`, {...data, sessionId: data.sessionId});
  return response.data.text;
}
```

This change ensures that the `sessionId` is included in the request data when the `sendLLMRequest` function is called.
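A small sketch of how that POST body is assembled, isolated from axios. Note that because `sessionId` is already a field of `data`, the spread alone would carry it; restating `sessionId: data.sessionId` is redundant but harmless, and just makes the intent explicit. The helper name and values below are illustrative, not code from the repository:

```typescript
interface LLMChatCompletionRequest {
  model: string;
  messages: { role: string; content: string }[];
  sessionId: string;
}

// Builds the POST body the same way the updated sendLLMRequest does.
function buildRequestBody(data: LLMChatCompletionRequest) {
  return { ...data, sessionId: data.sessionId };
}

const body = buildRequestBody({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'hello' }],
  sessionId: 'abc-123', // hypothetical id
});
// body carries model, messages, and sessionId together.
```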
In `/ui/pages/index.tsx`, update the call to `sendLLMRequest` in the `handleNewUserPrompt` function to include `sessionId`. The updated function call should look like this:

```typescript
const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results), sessionId });
```

This change ensures that the `sessionId` is passed to the `sendLLMRequest` function when it is called.
Get `sessionId` from the frontend - see `index.tsx`. Requires adding metadata to the llm endpoint and passing it through to `llm.py`.