Closed: ellipsis-dev[bot] closed this pull request 1 year ago.
The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| semantic-search-mini | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Oct 26, 2023 4:02pm |
[!IMPORTANT]
**Auto Review Skipped**

Bot user detected.

To trigger a single review, invoke the `@coderabbitai review` command.
Found 1 warning.

/ui/shared/api.ts

Does something look wrong? If this Code Review doesn't contain the expected results, you may need to update your rules. For more information, check the documentation.
Generated with :heart: by www.bitbuilder.ai.
Summary:
Issue: https://github.com/nsbradford/SemanticSearch/issues/41

Plan feedback: Approved by @nsbradford
Implementation:

- In the /ui/shared/api.ts file, add a new field `sessionId` to the `LLMChatCompletionRequest` interface. The `sessionId` should be of type `string`.
- In the /ui/pages/index.tsx file, modify the `sendLLMRequest` function call inside the `handleNewUserPrompt` function to pass the `sessionId` along with the existing data.
- In the /backend/models.py file, add a new field `sessionId` to the `LLMChatCompletionRequest` class. The `sessionId` should be of type `str`.
- In the /backend/main.py file, modify the `llm` endpoint to accept `sessionId` as part of the request body. Update the `llm_get` function call inside the `llm` endpoint to pass the `sessionId` along with the existing data.
- In the /backend/llm.py file, modify the `llm_get` function to accept `sessionId` as an argument. Update the `metadata` object inside the `acompletion` function call to include the `sessionId`.
**Add sessionId to LLMChatCompletionRequest interface**

Added `sessionId` to the `LLMChatCompletionRequest` interface in /ui/shared/api.ts.

**Pass sessionId to sendLLMRequest**

Modified the `sendLLMRequest` function call inside the `handleNewUserPrompt` function in /ui/pages/index.tsx to pass the `sessionId` along with the existing data.

**Add sessionId to LLMChatCompletionRequest model**

Added `sessionId` to the `LLMChatCompletionRequest` class in /backend/models.py.

**Modify llm endpoint to accept sessionId**

Modified the `llm` endpoint in /backend/main.py to accept `sessionId` as part of the request body and updated the `llm_get` function call inside the `llm` endpoint to pass the `sessionId` along with the existing data. Also modified the `llm_get` function in /backend/llm.py to accept `sessionId` as an argument and updated the `metadata` object inside the `acompletion` function call to include the `sessionId`.

**Modify llm_get function to accept sessionId**

Modified the `llm_get` function in /backend/llm.py to accept `sessionId` as an argument and updated the `metadata` object inside the `acompletion` function call to include the `sessionId`.

Does something look wrong? If this Pull Request doesn't contain the expected changes, add more information to #41. Then, add the `bitbuilder:create` label to try again. For more information, check the documentation.