Closed: frankqianms closed this issue 1 week ago
This is a duplicate of: https://github.com/microsoft/teams-ai/issues/1522
Ideally the sample should be updated to use a non-sequence, non-monologue augmentation. For example, if you were to update the code to match our Twenty Questions sample, the sample would work better. We just don't have the bandwidth at the moment to update the sample ourselves.
If you do end up making these changes, please also file a PR!
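To illustrate the suggested change: in the Teams AI library, a prompt's augmentation mode is set in that prompt's config.json. A minimal sketch of switching the sample away from a sequence/monologue augmentation might look like the fragment below. This is an assumption-based example, not the sample's actual config; the completion values shown are illustrative placeholders.

```json
{
  "schema": 1.1,
  "description": "Sample prompt without sequence/monologue augmentation",
  "type": "completion",
  "completion": {
    "completion_type": "chat",
    "max_input_tokens": 2800,
    "max_tokens": 1000,
    "temperature": 0.7
  },
  "augmentation": {
    "augmentation_type": "none"
  }
}
```

With `augmentation_type` set to `none`, the planner no longer asks the model to return the JSON command structure that triggers the repair loop seen in the error below, so a model that cannot reliably emit that JSON (such as the LLAMA model in this sample) is less likely to hit the max-repair limit.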
Got it! Thanks for sharing the information.
Language
JavaScript/TypeScript
Version
JS 1.3.0
Description
Ran the sample https://github.com/microsoft/teams-ai/tree/main/js/samples/03.ai-concepts/e.customModel-LLAMA and got the following error in the log:
[onTurnError] unhandled error: Error: Reached max model response repair attempts. Last feedback given to model: "Return a JSON object that uses the SAY command to say what you're thinking."
Error: Reached max model response repair attempts. Last feedback given to model: "Return a JSON object that uses the SAY command to say what you're thinking."
    at LLMClient.repairResponse (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\LLMClient.ts:488:24)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async LLMClient.repairResponse (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\LLMClient.ts:495:16)
    at async LLMClient.repairResponse (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\LLMClient.ts:495:16)
    at async LLMClient.completePrompt (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\LLMClient.ts:347:28)
    at async ActionPlanner.completePrompt (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\ActionPlanner.ts:250:16)
    at async ActionPlanner.continueTask (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\ActionPlanner.ts:178:24)
    at async ActionPlanner.beginTask (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\planners\ActionPlanner.ts:153:16)
    at async AI.run (C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\AI.ts:367:28)
    at async C:\Users\frankqian\TeamsApps\e.customModel-LLAMA\node_modules\@microsoft\teams-ai\src\Application.ts:823:21
Reproduction Steps