dannaf opened 1 week ago
I suppose the essence of what I am saying here is that I expect Theia AI, which has several agents as well as an orchestrator between them, to take the smart route to answering an AI chat query. In this particular case that means understanding that it can answer the question faster, cheaper, and better with a simple terminal command, and ideally doing so (or at least instructing the user to run the command in the terminal, but really Theia AI should ideally be able to access the terminal and run it itself, I think, unless there are serious security risks with that?).
(And at the very least, if it is still going to answer the question without terminal use, it should answer the question correctly. And ideally also quickly and cheaply.)
Thank you for this report. First of all, if you have any concrete optimizations of prompts or agents in mind, please feel free to suggest contributions. The AI support in the Theia IDE is in an early, experimental state and far from being optimized.

When I try to extract your report into potential actions to take, I see the following four:

- The workspace agent does not answer the question correctly => This could be due to the size of the context window; this should be checked.
- Enable agents to use the terminal => This is an interesting route, but potentially pretty dangerous without the user reviewing commands. So I would rather display the commands to the user and let the user execute them (see the sketch below).
- Enable a terminal chat agent => We have the terminal assistant embedded in the terminal view, but currently no chat agent for creating terminal commands; we should have one.
- More intelligent orchestrator => The flow you describe makes perfect sense. This is an optimization of the orchestrator prompt/flow so that it would use the (not yet existing) terminal chat agent for this example request.

Does this make sense to you?
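To illustrate the "display the command and let the user execute it" idea from the list above, here is a minimal sketch of a confirm-before-execute flow. This is not Theia's actual API; the function name and prompt text are hypothetical, and it only shows the general shape of letting the user review an agent-proposed command before anything runs.

```python
import subprocess
from typing import Optional

def run_proposed_command(command: str) -> Optional[str]:
    """Show an agent-proposed shell command and run it only after explicit user approval."""
    print(f"The agent proposes to run:\n  {command}")
    if input("Execute this command? [y/N] ").strip().lower() != "y":
        print("Command not executed.")
        return None
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    # Example: the command discussed later in this issue.
    output = run_proposed_command("git ls-files | wc -l")
    if output is not None:
        print(f"Output: {output.strip()}")
```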
Yes, sounds good. And thanks for putting together and launching Theia AI! (And Theia generally.)
Bug Description:
The Theia AI Workspace Agent is slow and inaccurate on a basic 'smoke test': when asked the simple question "how many files are in this repo?" while a workspace containing the theia repo is open.
Steps to Reproduce:
- Use `openai/gpt-4o-mini` as the model.
- Clone https://github.com/eclipse-theia/theia and open it in theia as a workspace.
- Ask the chat: "how many files are in this repo?"

Pasting the file list that the Workspace Agent retrieved into Python as `files = <paste>` and counting with `len(files)` says there are 3757 entries in this list.
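For reference, a minimal sketch of that counting step, assuming the list returned by the Workspace Agent has been saved to a local JSON file (the file name below is hypothetical). Note that the list appears to contain directory entries (e.g. `configs`, `dev-packages`) as well as files, which may partly explain why this count differs from the `git ls-files` count mentioned further below.

```python
import json

# Count the entries in the file list that the Workspace Agent retrieved.
# "workspace_file_list.json" is a hypothetical file containing the pasted
# JSON array of "file:///..." URIs.
with open("workspace_file_list.json") as f:
    files = json.load(f)

print(len(files))  # reported 3757 entries for the theia workspace

# The list seems to mix directories and files; a rough split based on
# whether the last path segment has a file extension (a heuristic only):
maybe_dirs = [uri for uri in files if "." not in uri.rsplit("/", 1)[-1]]
print(f"{len(maybe_dirs)} entries look like directories (heuristic)")
```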
``` ["file:///home/project/theia/.editorconfig","file:///home/project/theia/.eslintrc.js","file:///home/project/theia/.git","file:///home/project/theia/.github","file:///home/project/theia/.gitignore","file:///home/project/theia/.gitpod.dockerfile","file:///home/project/theia/.gitpod.yml","file:///home/project/theia/.npmignore","file:///home/project/theia/.theia","file:///home/project/theia/.vscode","file:///home/project/theia/.yarnrc","file:///home/project/theia/CHANGELOG.md","file:///home/project/theia/CODE_OF_CONDUCT.md","file:///home/project/theia/CONTRIBUTING.md","file:///home/project/theia/LICENSE-EPL","file:///home/project/theia/LICENSE-GPL-2.0-ONLY-CLASSPATH-EXCEPTION","file:///home/project/theia/LICENSE-vscode.txt","file:///home/project/theia/NOTICE.md","file:///home/project/theia/README.md","file:///home/project/theia/SECURITY.md","file:///home/project/theia/configs","file:///home/project/theia/configs/base.eslintrc.json","file:///home/project/theia/configs/base.tsconfig.json","file:///home/project/theia/configs/build.eslintrc.json","file:///home/project/theia/configs/errors.eslintrc.json","file:///home/project/theia/configs/mocharc.yml","file:///home/project/theia/configs/nyc.json","file:///home/project/theia/configs/typedoc.json","file:///home/project/theia/configs/warnings.eslintrc.json","file:///home/project/theia/configs/xss.eslintrc.json","file:///home/project/theia/dependency-check-baseline.json","file:///home/project/theia/dev-packages","file:///home/project/theia/dev-packages/application-manager","file:///home/project/theia/dev-packages/application-manager/.eslintrc.js","file:///home/project/theia/dev-packages/application-manager/README.md","file:///home/project/theia/dev-packages/application-manager/package.json","file:///home/project/theia/dev-packages/application-manager/src","file:///home/project/theia/dev-packages/application-manager/src/application-package-manager.ts","file:///home/project/theia/dev-packages/application-manager/src/application-process.ts","file:///home/project/theia/dev-packages/application-manager/src/expose-loader.ts","file:///home/project/theia/dev-packages/application-manager/src/generator","file:///home/project/theia/dev-packages/application-manager/src/generator/abstract-generator.ts","file:///home/project/theia/dev-packages/application-manager/src/generator/backend-generator.ts","file:///home/project/theia/dev-packages/application-manager/src/generator/frontend-generator.ts","file:///home/project/theia/dev-packages/application-manager/src/generator/index.ts","file:///home/project/theia/dev-packages/application-manager/src/generator/webpack-generator.ts","file:///home/project/theia/dev-packages/application-manager/src/index.ts","file:///home/project/theia/dev-packages/application-manager/src/package.spec.ts","file:///home/project/theia/dev-packages/application-manager/src/rebuild.ts","file:///home/project/theia/dev-packages/application-manager/tsconfig.json","file:///home/project/theia/dev-packages/application-package","file:///home/project/theia/dev-packages/application-package/.eslintrc.js","file:///home/project/theia/dev-packages/application-package/README.md","file:///home/project/theia/dev-packages/application-package/package.json","file:///home/project/theia/dev-packages/application-package/src","file:///home/project/theia/dev-packages/application-package/src/api.ts","file:///home/project/theia/dev-packages/application-package/src/application-package.spec.ts","file:///home/project/theia/dev-packages/application-package/src/application-pa
ckage.ts","file:///home/project/theia/dev-packages/application-package/src/application-props.ts","file:///home/project/theia/dev-packages/application-package/src/environment.ts","file:///home/project/theia/dev-packages/application-package/src/extension-package-collector.ts","file:///home/project/theia/dev-packages/application-package/src/extension-package.ts","file:///home/project/theia/dev-packages/application-package/src/index.ts","file:///home/project/theia/dev-packages/application-package/src/json-file.ts","file:///home/project/theia/dev-packages/application-package/src/npm-registry.ts","file:///home/project/theia/dev-packages/application-package/tsconfig.json","file:///home/project/theia/dev-packages/cli","file:///home/project/theia/dev-packages/cli/.eslintrc.js","file:///home/project/theia/dev-packages/cli/README.md","file:///home/project/theia/dev-packages/cli/bin","file:///home/project/theia/dev-packages/cli/bin/theia","file:///home/project/theia/dev-packages/cli/bin/theia-patch.js","file:///home/project/theia/dev-packages/cli/package.json","file:///home/project/theia/dev-packages/cli/patches","file:///home/project/theia/dev-packages/cli/patches/%40phosphor%2Bwidgets%2B1.9.3.patch","file:///home/project/theia/dev-packages/cli/patches/%40theia%2Bmonaco-editor-core%2B1.83.101.patch","file:///home/project/theia/dev-packages/cli/src","file:///home/project/theia/dev-packages/cli/src/check-dependencies.ts","file:///home/project/theia/dev-packages/cli/src/download-plugins.ts","file:///home/project/theia/dev-packages/cli/src/run-test.ts","file:///home/project/theia/dev-packages/cli/src/test-page.ts","file:///home/project/theia/dev-packages/cli/src/theia.ts","file:///home/project/theia/dev-packages/cli/tsconfig.json","file:///home/project/theia/dev-packages/ffmpeg","file:///home/project/theia/dev-packages/ffmpeg/.eslintrc.js","file:///home/project/theia/dev-packages/ffmpeg/README.md","file:///home/project/theia/dev-packages/ffmpeg/binding.gyp","file:///home/project/theia/dev-packages/ffmpeg/native","file:///home/project/theia/dev-packages/ffmpeg/native/ffmpeg.c","file:///home/project/theia/dev-packages/ffmpeg/native/ffmpeg.h","file:///home/project/theia/dev-packages/ffmpeg/native/linux-ffmpeg.c","file:///home/project/theia/dev-packages/ffmpeg/native/mac-ffmpeg.c"] ```Additional Information
When I asked with the Perplexity API model (`llama-3.1-sonar-small-128k-online`), I received a different answer, which was also not useful, in that it did not execute the procedure it suggested and provide me with an answer.

My point in mentioning this is that the proper expected result, in my opinion, is that the AI agent would perform some combination of these two answers: it should understand (and it seems to have the ability to) that one simple terminal command could answer the question, and it should run that command (similar to how it runs getWorkspaceFileList, although that is perhaps a bit different, as that function is presumably hardcoded in theia?) and return the output of that command.
This is both a convenience matter, in that the answer should be provided near-instantly at the speed of non-AI computation of a simple terminal command (when I ran the suggested `git ls-files | wc -l` in the theia terminal, it returned instantly, though interestingly with a different number, 3106, than my Python count of the getWorkspaceFilesList output), and a cost issue: there is no need to send a list of 3757 filenames to the AI model, which the API charges the user for per token.
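To make the cost point concrete, here is a rough back-of-the-envelope estimate; the average path length and the ~4-characters-per-token heuristic are assumptions, not measurements of the actual request.

```python
# Rough estimate of how many tokens it costs to send the whole file list to
# the model, versus sending a single command and its output.
NUM_ENTRIES = 3757      # entries counted via len(files) above
AVG_PATH_CHARS = 60     # assumption: average length of one "file:///..." URI
CHARS_PER_TOKEN = 4     # common rough heuristic, not an exact tokenizer count

list_tokens = NUM_ENTRIES * AVG_PATH_CHARS / CHARS_PER_TOKEN
command_tokens = len("git ls-files | wc -l\n3106\n") / CHARS_PER_TOKEN

print(f"~{list_tokens:,.0f} tokens to send the full file list")
print(f"~{command_tokens:.0f} tokens to send the command plus its output")
```

Even if the per-path estimate is off by a factor of two, the list amounts to tens of thousands of tokens, which is also consistent with the context-window concern raised above.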
Another issue: when I enabled all the AI agents, each with `gpt-4o-mini`, and asked the chat (so that the query goes initially to the orchestrator, although I did not actually see the orchestrator in action), it still used the workspace agent but now gave a different answer.

System info: