I've added a gptel-complete command for a "copilot" or coding assistant-style workflow. This feature is on the "copilot" branch of gptel.
If you are interested in this feature, I would be glad to get your feedback on it. There are two aspects to the feature:
The quality of the completions.
The quality of the UI and the user experience.
This testing focuses on the latter, so please let me know how you find the UI.
The former depends heavily on what context we send to the LLM. Right now the context is limited to everything in the current buffer before `(point)`, so the quality will not be good. We will fix this soon, once I start working on the context UI. We can use LSP/eglot, imenu and project.el to find a reasonable amount of context to include with requests.
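For illustration, the context currently gathered amounts to something like the sketch below. This is not gptel's actual implementation, and `my/completion-context` is a hypothetical name used only for this example:

```emacs-lisp
;; Minimal sketch of "everything in the buffer before point" as context.
;; Hypothetical helper, not part of gptel.
(defun my/completion-context ()
  "Return the buffer text before point, to send with a completion request."
  (buffer-substring-no-properties (point-min) (point)))
```

Richer sources (eglot symbols, imenu entries, project.el file listings) would be merged into this context in later iterations.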
I've started a discussion for this feature; please use that to respond -- this page is only an announcement and will be locked.
Further details on how to use gptel-complete are in the discussion page.